To see the other types of publications on this topic, follow the link: MD5.

Dissertations / Theses on the topic 'MD5'

Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles

Select a source type:

Consult the top 50 dissertations / theses for your research on the topic 'MD5.'

Next to every source in the list of references there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever it is available in the metadata.

Browse dissertations / theses in a wide variety of disciplines and organise your bibliography correctly.

1

Sridharan, Prathap. "A survey of the attack on MD5." College Park, Md. : University of Maryland, 2006. http://hdl.handle.net/1903/3689.

Full text
Abstract:
Thesis (M.S.) -- University of Maryland, College Park, 2006.
Thesis research directed by: Applied Mathematics and Scientific Computation Program. Title from t.p. of PDF. Includes bibliographical references. Published by UMI Dissertation Services, Ann Arbor, Mich. Also available in paper.
APA, Harvard, Vancouver, ISO, and other styles
2

Legendre, Florian. "Exploitation de la logique propositionnelle pour la résolution parallèle des problèmes cryptographiques." Thesis, Reims, 2014. http://www.theses.fr/2014REIMS006/document.

Full text
Abstract:
La démocratisation des ordinateurs, des téléphones portables et surtout de l'Internet a considérablement révolutionné le monde de la communication. Les besoins en matière de cryptographie sont donc plus nombreux et la nécessité de vérifier la sûreté des algorithmes de chiffrement est vitale. Cette thèse s'intéresse à l'étude d'une nouvelle cryptanalyse, appelée cryptanalyse logique, qui repose sur l'utilisation de la logique propositionnelle - à travers le problème de satisfaisabilité - pour exprimer et résoudre des problèmes cryptographiques. Plus particulièrement, les travaux présentés ici portent sur une certaine catégorie de chiffrements utilisés dans les protocoles d'authentification et d'intégrité de données qu'on appelle fonctions de hachage cryptographiques. Un premier point concerne l'aspect modélisation d'un problème cryptographique en un problème de satisfaisabilité et sa simplification logique. Sont ensuite présentées plusieurs façons pour utiliser cette modélisation fine, dont un raisonnement probabiliste sur les données du problème qui permet, entres autres, d'améliorer les deux principaux points d'une attaque par cryptanalyse logique, à savoir la modélisation et la résolution. Un second point traite des attaques menées en pratique. Dans ce cadre, la recherche de pré-Image pour les fonctions de hachage les plus couramment utilisées mènent à repousser les limites de la résistance de ces fonctions à la cryptanalyse logique. À cela s'ajoute plusieurs attaques pour la recherche de collisions dans le cadre de la logique propositionnelle
Democratization of increasingly high-performance digital technologies, and especially the Internet, has considerably changed the world of communication. Consequently, needs in cryptography are more and more numerous, and the necessity of verifying the security of cipher algorithms is essential. This thesis deals with a new cryptanalysis, called logical cryptanalysis, which is based on the use of logical formalism to express and solve cryptographic problems. More precisely, the work presented here focuses on a particular category of ciphers, called cryptographic hash functions, used in authentication and data-integrity protocols. Logical cryptanalysis is a specific algebraic cryptanalysis in which the cryptographic problem is expressed through the satisfiability problem, commonly called the SAT problem: a combinatorial decision problem that is central in complexity theory. In past years, work by the scientific community has produced efficient solvers for industrial and academic problems. The work presented in this thesis is the fruit of an exploration between satisfiability and cryptanalysis, and has yielded new results and innovative methods to weaken cryptographic functions. The first contribution is the modeling of a cryptographic problem as a SAT problem. For this, we present some rules that make it easy to describe the basic operations involved in cipher algorithms. A section is then dedicated to logical reasoning, in order to simplify the produced SAT formulas and to show how satisfiability can help enrich knowledge of a studied problem. Furthermore, we present several ways to use our fine-grained modeling to apply probabilistic reasoning to the data associated with the generated SAT formulas. This improved both the modeling and the solving of the problem, and exposed a weakness in the use of round constants. Second, a section is devoted to practical attacks. Within this framework, we tackled preimages of the most popular cryptographic hash functions. Moreover, the collision problem is also approached in different ways; in particular, Stevens's one-block collision attack on MD5 was translated into a logical context. It is interesting to remark that in both cases, logical cryptanalysis takes a new look at the considered problems.
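The core modeling step described above, expressing a cipher's Boolean operations as clauses of a SAT instance, can be illustrated on the smallest building block. The sketch below (ours, not the thesis's encoder) gives the standard CNF encoding of a single XOR constraint, an operation that pervades MD5, and checks it by enumeration:

```python
from itertools import product

# CNF encoding of c = a XOR b (standard Tseitin-style translation).
# Variables: 1 (a), 2 (b), 3 (c); a negative literal means negation.
xor_clauses = [
    [-1, -2, -3],  # a AND b         -> NOT c
    [-1,  2,  3],  # a AND NOT b     -> c
    [ 1, -2,  3],  # NOT a AND b     -> c
    [ 1,  2, -3],  # NOT a AND NOT b -> NOT c
]

def satisfies(assignment, clauses):
    """assignment maps var -> bool; a clause holds if any of its literals is true."""
    return all(
        any(assignment[abs(lit)] == (lit > 0) for lit in clause)
        for clause in clauses
    )

# Enumerate all 8 assignments: exactly those with c == a ^ b satisfy the CNF.
models = [
    (a, b, c)
    for a, b, c in product([False, True], repeat=3)
    if satisfies({1: a, 2: b, 3: c}, xor_clauses)
]
assert len(models) == 4
assert all(c == (a ^ b) for a, b, c in models)
```

A full logical cryptanalysis chains thousands of such constraints (XOR, modular addition, rotation) into one formula and hands it to a SAT solver.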
APA, Harvard, Vancouver, ISO, and other styles
3

Ващенко, К. А. "Розробка комп’ютерної системи конфіденційної передачі повідомлень." Thesis, Чернігів, 2020. http://ir.stu.cn.ua/123456789/23440.

Full text
Abstract:
Ващенко, К. А. Розробка комп’ютерної системи конфіденційної передачі повідомлень : випускна кваліфікаційна робота : 123 "Комп’ютерна інженерія" / К. А. Ващенко ; керівник роботи Є. В. Риндич ; НУ "Чернігівська політехніка", кафедра інформаційних та комп’ютерних систем. – Чернігів, 2020. – 71 с.
Об’єктом розробки дипломної роботи є комп’ютерна система конфіденційної передачі повідомлень, що включає в себе бібліотеку класів для розробки додатку та сервер, на якому зберігається потрібна інформація для шифрування та для знаходження користувачів один одними. Серверна частина розробляється на мові java й буде запускатися на ОС Windows, бібліотека для роботи з сервером також написана на java і її можна використовувати як і на мобільному пристрої, так і на комп’ютері. В проекті використовуються бібліотеки Gson, postgresql та популярні технології безбечної передачі інформації RSA-шифрування і digest-авторизація. Ціллю проведеної роботи – розробити сервер, який буде зберігати важливу інформацію про користувача, включаючи публічний ключ шифрування, який знадобиться для спілкування між кінцевими пристроями. Розподілити дані, що зберігає сервер, на тимчасові та постійні. Вибрати зручну базу данних (БД) для зберігання постійних данних. Зробити безпечне збереження паролів користувача в БД, щоб зловмисник, який отримає доступ до БД, не міг узнати чужих паролів та видати себе за іншого користувача. Надати зручний спосіб для роботи з сервером у вигляді бібліотеки. Поставлені наступні основні задачі: - Вибір популярної та зручної мови програмування; - Знайомство з принципами шифрування, безпечного зберігання конфіденційної інформації, знайомство з роботою з базою даних; - Налаштувати зручну БД; - Реалізувати проект; В ході виконання роботи були розроблені : - Сервер; - БД; - Бібліотека для звернення до серверу.
The object of this thesis is a computer system for confidential message transfer that includes a class library for application development and a server that stores the user information needed for encrypting messages and for users to find each other. The server part is developed in Java and runs on Windows; the library for working with the server is also written in Java and can be used in both desktop and mobile applications. The project uses the Gson and PostgreSQL libraries and popular technologies for safe information transfer: RSA encryption and digest authentication. The purpose of this work is to develop a server that stores important information about each user, including the public encryption key used for communication between end devices; to divide the data stored by the server into temporary and persistent; to choose a convenient database (DB) for the persistent data; to store user passwords in the DB safely, so that an attacker who gains access to the DB cannot learn users' passwords and impersonate another user; and to provide a convenient way to work with the server in the form of a library. The following main tasks were set: choosing a popular and convenient programming language; becoming familiar with encryption principles, with ways of safely storing confidential information, and with working with a DB from the chosen language; configuring the DB; and implementing the project. In the course of the work the following were developed: the server, the database, and the library for querying the server.
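The password-storage goal described above (an attacker who dumps the DB must not learn passwords) is typically met by storing salted, slow hashes rather than the passwords themselves. The thesis's server is written in Java; the Python sketch below only illustrates the principle, and the iteration count is our assumption:

```python
import hashlib
import hmac
import os

def hash_password(password, salt=None):
    """Derive a salted PBKDF2-HMAC-SHA256 hash; store (salt, digest), never the password."""
    salt = salt or os.urandom(16)
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
    return salt, digest

def verify_password(password, salt, stored):
    candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
    # Constant-time comparison avoids leaking information through timing.
    return hmac.compare_digest(candidate, stored)

salt, stored = hash_password("s3cret")
assert verify_password("s3cret", salt, stored)
assert not verify_password("wrong", salt, stored)
```

The salt makes identical passwords hash differently, and the high iteration count slows brute-force attempts against a stolen database.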
APA, Harvard, Vancouver, ISO, and other styles
4

Hajný, Jan. "Návrh řešení autentizace uživatelů pro malé a střední počítačové sítě." Master's thesis, Vysoké učení technické v Brně. Fakulta elektrotechniky a komunikačních technologií, 2008. http://www.nusl.cz/ntk/nusl-217309.

Full text
Abstract:
The main focus of this Master's thesis is user authentication and access control in a computer network. I analyze the TCP/IP model in connection with security and describe the main building blocks of authentication protocols (mainly hash functions). An analysis of authentication protocols follows: I begin with the LANMAN protocol for the purpose of a security comparison; NTLM, Kerberos and RADIUS follow. The focus is on Kerberos, which is chosen as the main authentication protocol; this is also why the modification used in MS domains is described. The implementation and functional verification are placed in the second, more practical part. Virtualization technology is used for easier manipulation. The result is a computer network model requiring user authentication and minimizing the possibility of an attack by unauthorized clients.
APA, Harvard, Vancouver, ISO, and other styles
5

Piller, Igor. "Hashovací funkce a jejich využití při autentizaci." Master's thesis, Vysoké učení technické v Brně. Fakulta elektrotechniky a komunikačních technologií, 2009. http://www.nusl.cz/ntk/nusl-218115.

Full text
Abstract:
This thesis concerns hash functions and their usage in authentication. It presents the basics of hash function theory and construction elements. In particular, the thesis focuses on the LMHash, MD4, MD5 and SHA-family hash functions, which are compared from the security point of view. The thesis describes in general the most frequently used attacks on hash functions, points out the weaknesses of current constructions and mentions the future perspective of hash functions. Furthermore, the thesis outlines the area of authentication and describes the usage of hash functions in that area. The practical part of the thesis contains an implementation of a general authentication framework in the C# programming language. The results are client and server applications, in which two selected authentication methods were successfully tested. The resulting implementation is flexible with respect to the possible future use of other authentication methods.
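A classic way hash functions enter authentication, of the kind such a framework can host, is nonce-based challenge-response: the server sends a fresh nonce and the client proves knowledge of the shared secret without ever transmitting it. A minimal sketch (ours, not the thesis's C# code; SHA-256 stands in for whichever hash a concrete method uses):

```python
import hashlib
import hmac
import os

# Server and client share a secret; the secret itself never crosses the wire.
shared_secret = b"correct horse battery staple"

def server_challenge():
    """A fresh random nonce per login attempt prevents replay of old responses."""
    return os.urandom(16)

def client_response(secret, nonce):
    """The client proves knowledge of the secret by hashing it with the nonce."""
    return hashlib.sha256(nonce + secret).digest()

def server_verify(secret, nonce, response):
    return hmac.compare_digest(client_response(secret, nonce), response)

nonce = server_challenge()
assert server_verify(shared_secret, nonce, client_response(shared_secret, nonce))
assert not server_verify(b"wrong", nonce, client_response(shared_secret, nonce))
```

An eavesdropper sees only the nonce and a digest, and a captured response is useless against the next (different) nonce.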
APA, Harvard, Vancouver, ISO, and other styles
6

Hadjichristofi, George Costa. "IPSec Overhead in Wireline and Wireless Networks for Web and Email Applications." Thesis, Virginia Tech, 2001. http://hdl.handle.net/10919/35710.

Full text
Abstract:
This research focuses on developing a set of secure communication network testbeds and using them to measure the overhead of IP Security (IPSec) for email and web applications. The network testbeds are implemented using both wireline and wireless technologies. The testing involves a combination of authentication algorithms such as Hashed Message Authentication Code-Message Digest 5 (HMAC-MD5) and Hashed Message Authentication Code-Secure Hash Algorithm 1 (HMAC-SHA1), implemented through different authentication protocols such as ESP and AH, and used in conjunction with the Triple Digital Encryption Standard (3DES). The research examines the overhead using no encryption and no authentication, authentication and no encryption, and authentication and encryption. A variety of sizes of compressed and uncompressed files are considered when measuring the overhead. The testbed realizes security using IPSec to secure the connection between different nodes. The email protocol used is the Simple Mail Transfer Protocol (SMTP) and the web protocol considered is the Hyper Text Transfer Protocol (HTTP). The key metrics considered are the network load in bytes, the number of packets, and the transfer time. This research emphasizes the advantage of using HTTP to access files rather than SMTP. Use of HTTP requires fewer packets, lower network loads, and lower transfer times than SMTP. It is demonstrated that this difference, which occurs regardless of security, is magnified by the use of authentication and encryption. The results also indicate the value of using compressed files for file transfers. Compressed and uncompressed files require the same transfer time, network load and number of packets, since FreeS/WAN IPSec does not perform any compression of the data before passing it to the data link layer. Both authentication algorithms, HMAC-MD5 and HMAC-SHA1, result in about the same network load and number of packets. However, HMAC-SHA1 results in a higher transfer time than HMAC-MD5 because of SHA1's higher computational requirements. ESP authentication and ESP encryption reduce the network load for small files only, compared to ESP encryption and AH authentication. ESP authentication could not be compared with AH authentication, since the FreeS/WAN IPSec implementation used in the study does not support ESP authentication without encryption. In a wireless environment, using IPSec does not increase the network load and the number of transactions compared to a wireline environment. Also, the effect of security on transfer time is higher than in a wireline environment, even though that increase is overshadowed by the high transfer-time percentage increase due to the wireless medium.
Master of Science
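The two authentication algorithms compared above are both HMAC constructions and can be reproduced with standard-library calls. The sketch below only shows how the tags are computed (the key and message are invented), not the IPSec packet framing:

```python
import hmac
import hashlib

key = b"shared-secret"                 # illustrative key, not from the thesis testbed
message = b"payload to authenticate"

# HMAC-MD5 yields a 128-bit tag, HMAC-SHA1 a 160-bit tag; SHA-1's heavier
# compression function is why the study observed higher transfer times for it.
tag_md5 = hmac.new(key, message, hashlib.md5).digest()
tag_sha1 = hmac.new(key, message, hashlib.sha1).digest()

assert len(tag_md5) == 16   # 128 bits
assert len(tag_sha1) == 20  # 160 bits
```

The near-identical tag sizes explain the study's finding that network load and packet counts barely differ between the two, while per-packet computation does.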
APA, Harvard, Vancouver, ISO, and other styles
7

Trusz, Jakob. "Content Management Systems and MD5: Investigating Alternative Methods of Version Identification for Open Source Projects." Thesis, Blekinge Tekniska Högskola, Institutionen för datalogi och datorsystemteknik, 2017. http://urn.kb.se/resolve?urn=urn:nbn:se:bth-14821.

Full text
Abstract:
WordPress is a very widely used content management system that enables users to create websites more easily. The popularity of WordPress has made it a prime target for attacks by hackers, since a potential vulnerability would affect many targets. Vulnerabilities that can be utilised in an attack are referred to as exploits. Most exploits are only viable for a subset of all the versions of the software that they target. The knowledge of which version of a content management system a website is running is often not explicit or easy to determine. Attackers can potentially exploit a vulnerable website faster if the version is known, since this allows them to search for existing vulnerabilities and exploits instead of trying to identify a new vulnerability. The purpose of this thesis is to investigate existing and alternative methods for detecting the version of WordPress on websites that are powered by it. The scope is limited to an analysis of existing tools, and the suggested methods for version identification are limited to identification using unique values calculated from the contents of files. The suggested methods for version identification, and the generation of the required data, are implemented in Python 3. We investigate the feasibility of version obfuscation, how discernible a version of WordPress is, and how to compare versions of WordPress. The thesis has proven the feasibility of version identification with a new perspective that delivers more accurate results than previous methods. Version obfuscation has also been proven to be very feasible without affecting the usability of the WordPress website. Furthermore, a method for discerning between two specific versions of WordPress is presented. All the results are in theory applicable to other software projects that are hosted and developed in the same way. This new area of research has much to offer security professionals and has room for future improvement.
APA, Harvard, Vancouver, ISO, and other styles
8

Lokby, Patrik, and Manfred Jönsson. "Preventing SQL Injections by Hashing the Query Parameter Data." Thesis, Blekinge Tekniska Högskola, Institutionen för datalogi och datorsystemteknik, 2017. http://urn.kb.se/resolve?urn=urn:nbn:se:bth-14922.

Full text
Abstract:
Context. Many applications today use databases to store user information or other data. This information can be accessed through various languages depending on what type of database it is. Databases that use SQL can be maliciously exploited with SQL injection attacks. This type of attack involves inserting SQL code in the query parameter. The injected code sent from the client will then be executed on the database. This can lead to unauthorized access to data or other modifications within the database. Objectives. In this study we investigate whether a system can be built which prevents SQL injection attacks from succeeding on web applications connected to a MySQL database. In the intended model, a proxy is placed between the web server and the database. The purpose of the proxy is to hash the SQL query parameter data and remove any characters that the database will interpret as comment syntax. By processing each query before it reaches its destination, we believe we can prevent vulnerable SQL injection points from being exploited. Methods. A literature study is conducted to gain the knowledge needed to accomplish the objectives of this thesis. A proxy is developed and tested within a system containing a web server and database. The tests are analyzed to arrive at a conclusion that answers our research questions. Results. Six tests are conducted, which include detection of vulnerable SQL injection points and the difference in delay on the system with and without the proxy. The results are presented and analyzed in the thesis. Conclusions. We conclude that the proxy prevents SQL injection points from being exploitable on the web application. Vulnerable SQL injection points are still reported even with the proxy deployed in the system. The web server is able to process more HTTP requests that require a database query when the proxy is not used within the system. More studies are required since there are still vulnerable SQL injection points.
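The proxy's two transformations described above, stripping comment syntax and then hashing the query parameter data, can be caricatured in a few lines. This is a toy sketch of the idea, not the thesis's proxy; the set of comment markers and the choice of SHA-256 are our assumptions:

```python
import hashlib
import re

# Common MySQL comment markers (our guess at what "comment syntax" covers).
COMMENT_SYNTAX = re.compile(r"(--|#|/\*|\*/)")

def neutralise_parameter(value: str) -> str:
    """Strip comment markers, then replace the value with its SHA-256 hex digest.

    Whatever the client sent (quotes, keywords, semicolons), the database only
    ever sees a fixed-alphabet hex string, which cannot be parsed as SQL.
    """
    stripped = COMMENT_SYNTAX.sub("", value)
    return hashlib.sha256(stripped.encode()).hexdigest()

# A classic injection attempt collapses into an inert hex token:
malicious = "' OR '1'='1' -- "
token = neutralise_parameter(malicious)
assert re.fullmatch(r"[0-9a-f]{64}", token)
```

The trade-off the thesis measures follows directly: every lookup must compare hashed values, and the extra hop through the proxy costs throughput.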
APA, Harvard, Vancouver, ISO, and other styles
9

Karlsson, Marcus, and Oscar Zaja. "Improving Security In Embedded Systems With IEEE 802.1X." Thesis, Mälardalens högskola, Akademin för innovation, design och teknik, 2021. http://urn.kb.se/resolve?urn=urn:nbn:se:mdh:diva-53322.

Full text
APA, Harvard, Vancouver, ISO, and other styles
10

Dahlin, Karl. "Hashing algorithms : A comparison for blockchains in Internet of things." Thesis, Mittuniversitetet, Avdelningen för informationssystem och -teknologi, 2018. http://urn.kb.se/resolve?urn=urn:nbn:se:miun:diva-34057.

Full text
Abstract:
In today’s society, blockchains and the Internet of things are two much-discussed subjects, which has led to thoughts about combining them by using a blockchain in the Internet of things. The objective of this study has been to determine which hashing algorithm is best for a blockchain used in an Internet of things network. This has been done by first selecting a few hashing algorithms and setting up a scenario where a blockchain can be used in an Internet of things network, and then specifying what to compare: speed, input length and output length. The study has been conducted with the aid of literature studies about the hashing algorithms and a program that implements the algorithms and tests their speed. The study has shown that, out of the selected hashing algorithms MD5, SHA-256 and SHA3-256, under the conditions specified for the study, SHA3-256 is the best option if speed is not of the utmost importance in the scenario, since it comes from a newer standard and does not have a maximum input length. If speed is very important, in other words if SHA3-256 is too slow, then SHA-256 would be best for the Internet of things network.
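The speed test described above is easy to re-run with Python's standard library; the sketch below is an illustrative reconstruction, not the study's original program, and the payload size and round count are arbitrary:

```python
import hashlib
import timeit

payload = b"x" * 4096  # an arbitrary block size for the comparison

def bench(name, data=payload, rounds=2000):
    """Seconds to hash `data` `rounds` times with algorithm `name`."""
    return timeit.timeit(lambda: hashlib.new(name, data).digest(), number=rounds)

for name in ("md5", "sha256", "sha3_256"):
    print(f"{name:>9}: {bench(name):.4f} s")

# Output lengths: MD5 produces 128-bit digests, SHA-256 and SHA3-256 256-bit.
assert hashlib.new("md5", payload).digest_size == 16
assert hashlib.new("sha256", payload).digest_size == 32
assert hashlib.new("sha3_256", payload).digest_size == 32
```

Absolute timings depend on the CPU and the OpenSSL build behind hashlib, which is exactly why the study fixes a single test environment before comparing.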
APA, Harvard, Vancouver, ISO, and other styles
11

Lorio, Ryan. "Feasibility of Determining Radioactivity in Lungs Using a Thyroid Uptake Counter." Thesis, Available online, Georgia Institute of Technology, 2005, 2005. http://etd.gatech.edu/theses/available/etd-08102005-173443/.

Full text
Abstract:
Thesis (M. S.)--Mechanical Engineering, Georgia Institute of Technology, 2006.
Ansari, Armin, Committee Member ; Hertel, Nolan, Committee Chair ; Wang, Chris, Committee Member. Includes bibliographical references.
APA, Harvard, Vancouver, ISO, and other styles
12

Jassim, Taha D. "Combined robust and fragile watermarking algorithms for still images. Design and evaluation of combined blind discrete wavelet transform-based robust watermarking algorithms for copyright protection using mobile phone numbers and fragile watermarking algorithms for content authentication of digital still images using hash functions." Thesis, University of Bradford, 2014. http://hdl.handle.net/10454/6460.

Full text
Abstract:
This thesis deals with copyright protection and content authentication for still images. New blind transform-domain, block-based algorithms using one-level and two-level Discrete Wavelet Transform (DWT) were developed for copyright protection. A mobile number with international code is used as the watermarking data. The robust algorithms use the low-low frequency coefficients of the DWT to embed the watermarking information. The watermarking information is embedded in the green channel of RGB colour images and the Y channel of YCbCr images, and is scrambled using a secret key to increase the security of the algorithms. Due to the small size of the watermarking information compared to the host image, the embedding process is repeated several times, which increases the robustness of the algorithms. A shuffling process is implemented during the multiple embedding in order to avoid spatial correlation between the host image and the watermarking information. The effects of using one-level and two-level DWT on robustness and image quality have been studied. The Peak Signal to Noise Ratio (PSNR), the Structural Similarity Index Measure (SSIM) and the Normalized Correlation Coefficient (NCC) are used to evaluate the fidelity of the images. Several greyscale and colour still images are used to test the new robust algorithms. The new algorithms offered better robustness against different attacks, such as JPEG compression, scaling, salt-and-pepper noise, Gaussian noise, filters and other image processing, compared to DCT-based algorithms. The authenticity of the images was assessed using a fragile watermarking algorithm with a hash function (MD5) as the watermarking information, embedded in the spatial domain. The new algorithm showed high sensitivity to any tampering with the watermarked images. The combined fragile and robust watermarking caused minimal distortion to the images. The combined scheme achieved both copyright protection and content authentication.
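The fragile half of the scheme, an MD5 digest embedded in the spatial domain so that any tampering breaks verification, can be sketched on a raw byte buffer. This is a simplification of the thesis's algorithm: real images, the channel choice and the embedding positions are abstracted away:

```python
import hashlib

def embed_digest(pixels: bytearray) -> bytearray:
    """Hash the data with every LSB cleared, then write the 128 digest bits
    into the first 128 least-significant bits of the carrier."""
    carrier = bytearray(b & 0xFE for b in pixels)   # clear every LSB
    digest = hashlib.md5(bytes(carrier)).digest()   # 16 bytes = 128 bits
    for i in range(128):
        bit = (digest[i // 8] >> (7 - i % 8)) & 1
        carrier[i] |= bit
    return carrier

def verify_digest(pixels: bytearray) -> bool:
    """Recompute the digest over the LSB-cleared data and compare it with
    the bits read back from the first 128 LSBs."""
    carrier = bytearray(b & 0xFE for b in pixels)
    digest = hashlib.md5(bytes(carrier)).digest()
    embedded = 0
    for i in range(128):
        embedded = (embedded << 1) | (pixels[i] & 1)
    return embedded == int.from_bytes(digest, "big")

marked = embed_digest(bytearray(range(256)))
assert verify_digest(marked)
marked[200] ^= 0x80            # tamper with one "pixel"
assert not verify_digest(marked)
```

Because flipping any non-LSB bit changes the recomputed digest, even a one-pixel edit is detected, which is precisely the fragility the scheme relies on.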
APA, Harvard, Vancouver, ISO, and other styles
14

Mészáros, István. "Distributed P2P Data Backup System." Master's thesis, Vysoké učení technické v Brně. Fakulta informačních technologií, 2013. http://www.nusl.cz/ntk/nusl-236378.

Full text
Abstract:
This master's thesis presents a model and prototype of a cooperative distributed data backup system based on a P2P communication network. The system design allows users to contribute their free local disk space to the system in exchange for reliable storage of their data with other users. The presented solution tries to meet users' data-storage requirements while also dealing with the unpredictability of users regarding the provision of free space. This is done in two ways: by using Reed-Solomon codes and by offering configurable availability parameters. One of these parameters is a time schedule indicating when a user can offer a predictable contribution to the system; the other concerns the reliability of a particular user within the promised time slot. The system is able to schedule the placement of stored data based on these parameters. The work also focuses on securing the system against a wide spectrum of possible attacks. The main goal is to publish the concept and the prototype. Since this is a relatively new solution, feedback from the wider public that may use the product is also important; their comments and suggestions are an impulse for further development of the system.
APA, Harvard, Vancouver, ISO, and other styles
15

Heicke, Matthias. "Automated sequential composition of deltas and related optimization operations : An additional research to metamodel independent difference representation." Thesis, Mälardalen University, 2009. http://urn.kb.se/resolve?urn=urn:nbn:se:mdh:diva-7478.

Full text
Abstract:

Model-Driven Engineering (MDE) raises models to first-class status by shifting the focus of software development from coding to modeling. This thesis extends Antonio Cicchetti's paper "Difference Representation and Conflict Management in Model-Driven Engineering", adding concrete research on sequential composition. Differences between models can be represented as deltas in a metamodel-independent way. When working with these deltas, a need for sequential composites appears: several sequential deltas are merged together into a new delta. Since this delta contains a lot of unnecessary information, it needs to be optimized with respect to the minimality paradigm mentioned in the corresponding paper. This thesis supplies the reader with a broad overview of the basic concepts, the difference representation and application including the metamodel-independent approach, and finally a narrow examination of the research topic, including constraints, examples and implementation details.
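Sequential composition of deltas, as described above, can be illustrated on a toy delta representation: a mapping from model element to new value, with None meaning deletion. The representation is ours and far simpler than the thesis's metamodel-based one; composition lets the later delta win per element:

```python
def compose(first: dict, second: dict) -> dict:
    """Sequential composition: apply `first`, then `second`.

    For each element, the later delta's change overrides the earlier one,
    which is the simplest form of the minimality the thesis optimizes for
    (redundant intermediate changes disappear from the composite).
    """
    composite = dict(first)
    composite.update(second)
    return composite

d1 = {"ClassA.name": "Person", "ClassB": None}   # rename one element, delete another
d2 = {"ClassA.name": "Employee"}                 # a later rename overrides the first
assert compose(d1, d2) == {"ClassA.name": "Employee", "ClassB": None}
```

A real implementation must also reconcile structurally dependent changes (e.g. a delta editing an element a prior delta deleted), which is where the conflict management of the original paper comes in.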

APA, Harvard, Vancouver, ISO, and other styles
16

Mabrouki, Olfa. "Semantic Framework for Managing Privacy Policies in Ambient Intelligence." Thesis, Paris 11, 2014. http://www.theses.fr/2014PA112319/document.

Full text
Abstract:
L'objectif de ce travail de thèse est de proposer un canevas sémantique intégrant un méta-modèle et des outils de raisonnement permettant à tout concepteur de système ubiquitaire de mettre en oeuvre facilement des mécanismes de gestion des politiques de la vie privée. Le canevas proposé intègre une architecture middleware générique qui offre des composants pour définir, administrer et contrôler l'application des politiques de confidentialité. Notre approche proposée est hybride. Elle est fondée sur l’ingénierie dirigée par les modèles et sur un raisonnement à base d'ontologies et de règles d'inférence opérant selon l'hypothèse du monde clos. Le méta-modèle proposé est caractérisé par un niveau d'abstraction et d'expressivité élevé permettant de définir des politiques de gestion de la vie privée indépendamment du domaine d'application pouvant être adaptées à différents contextes. Il définit, aussi, un cadre conceptuel pour établir des modèles de règles génériques et décidables permettant de prendre des décisions de contrôle cohérentes pour la protection de la vie privée. Ces modèles de règles sont mis en oeuvre grâce au langage de règles SmartRules permettant de mettre en oeuvre un contrôle adaptatif. Ce dernier est basé sur un raisonnement non-monotone et une représentation des instances de concepts selon la supposition du nom unique. Nous avons validé le canevas proposé à travers un scénario typique mettant en oeuvre des services d'assistance ambiante sensibles à la vie privée de personne âgée
This thesis aims at proposing a semantic framework that integrates a meta-model and reasoning tools allowing any ubiquitous-system designer to easily implement mechanisms to manage privacy policies. The proposed framework includes a generic middleware architecture that provides components to define, manage and monitor the application of privacy policies. Our approach is a hybrid one, based on Model-Driven Engineering and on reasoning with ontologies and inference rules operating under the closed-world assumption. The proposed meta-model is characterized by a high level of abstraction and expressiveness, allows privacy-management policies to be defined regardless of the application domain, and can be adapted to different contexts. It also defines a conceptual framework for generic, decidable rule models that make consistent control decisions on user privacy. These rule models are implemented using the SmartRules language, which supports adaptive control; the latter is based on non-monotonic reasoning and a representation of concept instances under the unique-name assumption. We have validated the proposed framework through a typical scenario implementing privacy-aware ambient-assistance services for an elderly person.
APA, Harvard, Vancouver, ISO, and other styles
17

ARAUJO, GIL MACHADO GUIGON DE. "CHALLENGES FOR APPLYING THE CRADLE-TO-CRADLE METHODOLOGY TO THE LIFE CYCLE OF MDF AND MDP FURNITURE." PONTIFÍCIA UNIVERSIDADE CATÓLICA DO RIO DE JANEIRO, 2012. http://www.maxwell.vrac.puc-rio.br/Busca_etds.php?strSecao=resultado&nrSeq=35163@1.

Full text
Abstract:
The growing awareness of the importance of environmental protection and of the impacts associated with consumer goods has increased interest in developing methods to better understand and deal with these impacts. In this context, the cradle-to-cradle (C2C) methodology describes a way of designing products with closed life cycles (biological or technical), so as to avoid the loss of soil nutrients and of non-renewable raw materials. In the furniture industry, solid wood has increasingly been replaced by industrial panels (MDF and MDP), which offer higher productivity, a renewable raw material and a sustainable market positioning. The way these products are discarded today, however, is not aligned with that discourse. Based on a literature review, site visits and interviews, the life cycle of MDF and MDP furniture was described so that the possibilities of adapting it to the C2C methodology could then be evaluated. The study concluded that, since the material can be fitted to either the biological or the technical cycle, its potential for closing the loop is considerable; however, barriers must still be overcome, such as the use of components harmful to health in its composition and the fragmented disposal of furniture.
The perception that human activities might have a significant impact on the environment led, in recent decades, to the development of policies and methodologies to better understand and handle the subject. The United Nations Conference on Environment and Development (UNCED), also known as the Rio Summit or ECO-92, was a major event in that direction. More than one hundred heads of state gathered with representatives of society, industry and environmental groups to discuss sustainable development and global warming (MCDONOUGH and BRAUNGART, 2002). One result of the negotiations was the definition of the eco-efficiency strategy, which guided industry's approach to the issue over the following two decades. Reducing direct and indirect environmental impact at every possible opportunity became one of the main strategies for achieving such eco-efficiency. To identify these opportunities, the product's life cycle became the focus of studies by researchers, companies and governments. Brazilian Law no. 12.305/2010, for example, defines the life cycle as the series of stages related to the development of the product, the acquisition of raw materials, the production process, consumption and final disposal. In a similar way, the International Organization for Standardization (ISO) defines the life cycle as the consecutive and interlinked stages of a product system, from raw material acquisition or generation from natural resources to final disposal. ISO's 14000 series of standards is one of the main tools providing inputs to this debate; more specifically, ISO 14020 and 14040 regulate environmental labels and life cycle assessment (LCA). While labels are essential for communicating a product's characteristics to the market, LCA is one of the most complete tools for supporting decisions related to the development of products or services.
It can be applied to the whole life cycle, from raw material extraction to disposal, or to specific stages, and it allows two different solutions to be compared on the basis of the same functional unit, or objective, making it easier to identify the one with the least negative impact. The cradle-to-cradle (C2C) methodology developed by MCDONOUGH and BRAUNGART (2002) takes the reduction of environmental impact a step further, suggesting that instead of merely reducing their negative impact, products and services should have a positive impact on the environment.
APA, Harvard, Vancouver, ISO, and other styles
18

OLIVEIRA, Thiago Araújo Silva de. "Geração de código estrutural implantável em nuvens a partir de modelos de componentes independentes de plataforma." Universidade Federal de Pernambuco, 2011. https://repositorio.ufpe.br/handle/123456789/2811.

Full text
Abstract:
Made available in DSpace on 2014-06-12T16:01:18Z (GMT). No. of bitstreams: 2 arquivo7566_1.pdf: 2847690 bytes, checksum: 1d3626862b82aca95ac1d01b74011871 (MD5) license.txt: 1748 bytes, checksum: 8a4605be74aa9ea9d79846c1fba20a33 (MD5) Previous issue date: 2011
Model-Driven Engineering (MDE) aims to improve software productivity and quality by shifting resources that in most projects are spent on programming-platform-specific issues towards platform-independent business concerns. In a project targeting a single implementation platform, the return on the investment in models is clear only if a large part of the code is generated automatically from platform-independent models (PIM). However, this is still a challenge, a goal yet to be reached. This master's dissertation contributes to the WAKAME project and shows that this goal is attainable. The project concentrates its efforts on building an MDE CASE tool available as a web application. With WAKAME, the developer can specify the application's PIM by editing views in the tool. Structural views use UML class diagrams, while operational views use Imperative OCL expressions. These views are unified into a single underlying model (SUM), the target of the transformations. WAKAME aims to let the user, once the PIM specification is complete, automatically generate the code and deploy the application on Google's cloud service. Within this goal, this dissertation contributes to the generation of structural code and to the application's infrastructure tasks. Methodologically, this work also contributes an innovative architecture with two code-generation phases: 1) creation of a new representation of the model through a platform-independent transformation framework; 2) transformation of this object representation into code through a template engine. The new representation offers an architecture extensible to other platforms
APA, Harvard, Vancouver, ISO, and other styles
19

Campoy, Ungria Jose Manuel. "Nueva metodología para la obtención de distancias de visibilidad disponibles en carreteras existentes basada en datos LiDAR terrestre." Doctoral thesis, Universitat Politècnica de València, 2015. http://hdl.handle.net/10251/59062.

Full text
Abstract:
[EN] The existence of a visibility that is appropriate to the actual operating conditions is a sine qua non to achieve a safe geometric design. The sight distances required in driving tasks, such as decision-making, stopping, overtaking or crossing, represent an essential parameter in the geometric design of new roads; and they play a key role in all international design guidelines. Nevertheless, once the road has been built and operating, many other surrounding circumstances do determine the actual sight distance available over time. Moreover, since geometric design guidelines encompass visibility measurements based on the observer and the obstacle located on the roadway, systematic and periodic measurements prove difficult and tedious as well as risky and traffic-disruptive. In engineering practice, it is common to use elevation digital models and geometric design specific programs to establish the visibility conditions on roads; however, the development of new remote sensing technologies expand the possibilities to better estimate the visibility actually available. LiDAR technology has been enjoying a boost internationally in recent years. It is an important source of information that consists of millions of georeferenced points belonging to all kinds of objects, which represent not only the geometry of the road itself, but also its more immediate surroundings. It is precisely this ability to include all sorts of potential obstacles to vision in the analysis that raised our interest. This PhD thesis presents a newly developed and tested methodology for the systematic assessment of visibility available on roads that deploys visuals directly drawn against the LiDAR point cloud. To this purpose the concepts of Visual Prism (VP) and Rectangular Prismatic Unit (RPU) have been defined as key elements in this new way of thinking about vision. They represent an alternative to the traditional straight line drawn between the observer and the object. 
During the research, the impact of point cloud density on the results was analyzed, and the methodology was compared with the visibility results yielded by known techniques based on digital terrain models, digital surface models and project profiles over two existing road sections. In general, conventional methods overestimate sight distance compared to the new methodology based on LiDAR data, and in many cases the overestimation is significant. The tool, which jointly displays the sight lines and the three-dimensional point cloud, also makes it possible to identify the cause of each visual obstruction. This improvement is ready for practice and can be used when assessing a road and improving its sight distance and road safety conditions.
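The core idea in the abstract — testing sight lines directly against the raw LiDAR point cloud rather than against a surface model — can be sketched minimally. The following Python sketch is not the thesis's implementation: it uses a simple distance-to-segment test in place of the Visual Prism / Rectangular Prismatic Unit construction, and all coordinates are invented for illustration.

```python
# A sight line between observer and target is considered obstructed if any
# LiDAR point falls within half_width of the 3-D segment joining them.
# (Simplified stand-in for the rectangular prismatic units in the thesis.)

def blocked(observer, target, cloud, half_width=0.25):
    ox, oy, oz = observer
    tx, ty, tz = target
    dx, dy, dz = tx - ox, ty - oy, tz - oz
    seg2 = dx * dx + dy * dy + dz * dz  # squared segment length
    for px, py, pz in cloud:
        # Parameter t of the point on the segment closest to (px, py, pz).
        t = ((px - ox) * dx + (py - oy) * dy + (pz - oz) * dz) / seg2
        if 0.0 < t < 1.0:  # closest point lies strictly between the endpoints
            cx, cy, cz = ox + t * dx, oy + t * dy, oz + t * dz
            d2 = (px - cx) ** 2 + (py - cy) ** 2 + (pz - cz) ** 2
            if d2 <= half_width ** 2:
                return True
    return False

observer = (0.0, 0.0, 1.1)                  # driver eye height (invented)
target = (50.0, 0.0, 0.6)                   # object on the carriageway
tree = [(25.0, 0.1, 0.9)]                   # point near the middle of the line
print(blocked(observer, target, tree))                 # True
print(blocked(observer, target, [(25.0, 5.0, 0.9)]))   # False
```

A real implementation would iterate this test over sight lines of increasing length along the alignment, reporting the longest unobstructed distance as the available sight distance.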
Campoy Ungria, JM. (2015). Nueva metodología para la obtención de distancias de visibilidad disponibles en carreteras existentes basada en datos LiDAR terrestre [Tesis doctoral no publicada]. Universitat Politècnica de València. https://doi.org/10.4995/Thesis/10251/59062
APA, Harvard, Vancouver, ISO, and other styles
20

Viana, Diogenes Carvalho. "Análise da qualidade cartográfica de MDS e MDE gerados por VANT e refinados com uso de dados GNSS RTK." Universidade Federal de Viçosa, 2017. http://www.locus.ufv.br/handle/123456789/19991.

Full text
Abstract:
Submitted by Reginaldo Soares de Freitas (reginaldo.freitas@ufv.br) on 2018-06-08T12:44:39Z No. of bitstreams: 1 texto completo.pdf: 1636105 bytes, checksum: fe486462f6641f82f1364d701c258bd6 (MD5)
Made available in DSpace on 2018-06-08T12:44:39Z (GMT). No. of bitstreams: 1 texto completo.pdf: 1636105 bytes, checksum: fe486462f6641f82f1364d701c258bd6 (MD5) Previous issue date: 2017-12-19
Among the techniques used for data acquisition, surveys with Unmanned Aerial Vehicles (UAV) and Real Time Kinematic (RTK) GNSS positioning surveys stand out. These two technologies are of great interest for generating Digital Elevation Models (DEM) because of the practicality and ease of data acquisition. The objective of this study is to evaluate the quality of DEMs generated from UAV and GNSS RTK data, and to propose and evaluate the integration of such data with the aim of refining the DEM. The evaluation was performed through data collection with a total station and a GNSS RTK receiver in an area belonging to the Federal University of Viçosa, in the city of Viçosa-MG. The Digital Surface Model (DSM) was filtered using the LAStools software to generate the DEM. The UAV data were then refined with GNSS RTK receiver data in an attempt to improve their cartographic quality. In the first approach, control points obtained by GNSS RTK were added at the locations where the filtering process had left data gaps. In the second approach, points were selected in two different ways: along directions and over a regular grid. After selecting the points, the discrepancy between them and the models (DEM and DSM) was calculated; the discrepancy values were then interpolated, and each interpolated surface was subtracted from the original model, generating a new refined DEM or DSM. All models were validated against the points obtained by topographic survey and classified according to the positional accuracy standard defined by Decree 89.817 / ET-CQDG. The evaluation showed that integrating the UAV data with GNSS RTK data improved the positional quality of most of the models analyzed, indicating a promising tool for producing models of higher cartographic quality or for improving existing ones.
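The refinement step described in the abstract — interpolating the discrepancies measured at GNSS check points and subtracting the interpolated surface from the model — can be sketched as follows. All data are invented, and the inverse-distance interpolator is an assumption for illustration; the thesis does not prescribe these choices.

```python
def idw(x, y, samples, power=2.0):
    """Inverse-distance-weighted value at (x, y) from (sx, sy, value) samples."""
    num = den = 0.0
    for sx, sy, v in samples:
        d2 = (x - sx) ** 2 + (y - sy) ** 2
        if d2 == 0.0:
            return v  # exactly at a sample: return its value
        w = 1.0 / d2 ** (power / 2)
        num += w * v
        den += w
    return num / den

# Hypothetical 3x3 DEM grid (elevations in metres) and two check points,
# each given as (x, y, discrepancy = model elevation - GNSS elevation).
dem = [[100.0, 100.5, 101.0],
       [100.2, 100.7, 101.2],
       [100.4, 100.9, 101.4]]
checks = [(0.0, 0.0, 0.3), (2.0, 2.0, -0.1)]

# Subtract the interpolated discrepancy surface to obtain the refined DEM.
refined = [[dem[r][c] - idw(c, r, checks) for c in range(3)]
           for r in range(3)]
print(refined[0][0])  # 100.0 - 0.3 = 99.7
```

In practice the interpolation would be done over a dense grid with a production interpolator (e.g. kriging or TIN-based), but the subtraction step is the same.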
APA, Harvard, Vancouver, ISO, and other styles
21

Estima, Maria Inês Duarte Ramos. "Comparação de modelos tridimensionais produzidos com imagens adquiridas por UAV e avaliação de volumes." Master's thesis, Universidade de Aveiro, 2016. http://hdl.handle.net/10773/16919.

Full text
Abstract:
Master's degree in Geoinformatics
In the field of geoengineering, one of the major obstacles has always been the cost of the operations involved in delivering the many types of services and products this vast area encompasses. A major benefit therefore lies in using new or more recent technologies that minimize these costs while yielding products and services equal to or better than those produced with traditional methods and technologies. As a result of these needs, numerous systems from numerous suppliers have appeared in recent years across the various branches of geoengineering, which translates into a long list of possible equipment and tools for these purposes. Thus, in addition to evaluating the accuracy and precision of these new technologies against traditional methods, this dissertation also aims to compare the new technologies among themselves, across the various options on the market. The dissertation topics fall within two projects developed by Geolayer - Estudos de Território, Lda: "ROADMAP BCP BAIÃO", which consists of 3D modelling based on data collected with a UAV, followed by comparison and evaluation of the models generated with different software packages (Part 1 of the dissertation), and "FERROVIAL Aterro 16", which calculates volumes from landfill data collected with a UAV for comparison with calculations based on topographic surveys and with physical mock-ups of known volume (Part 2 of the dissertation). The first part covers the work involved in creating a three-dimensional environment for promoting a property on the real-estate market, allowing 3D manipulation and visualization by the owning entity and, above all, by potential buyers. The property is located in the municipality of Baião, is owned by the bank Millennium BCP, and consists of land and a building situated in a restricted-access zone.
The second part of the dissertation is based on a comparison test between landfill volumes calculated from topographic survey data and landfill volumes calculated from data collected with a UAV system. The landfill in question, called "Aterro do Planalto Beirão", is located in the municipality of Tondela and is operated under concession by FERROVIAL SERVIÇOS SA.
APA, Harvard, Vancouver, ISO, and other styles
22

Karásek, Jan. "Hashovací funkce - charakteristika, implementace a kolize." Master's thesis, Vysoké učení technické v Brně. Fakulta elektrotechniky a komunikačních technologií, 2009. http://www.nusl.cz/ntk/nusl-218059.

Full text
Abstract:
Hash functions are among the building blocks of modern cryptography. Their task is to map the data supplied at the input to a byte sequence that uniquely characterizes it. Hash functions are used in many application areas, such as message integrity verification and information authentication, and they appear in cryptographic protocols, in data comparison and in other applications. The goal of this master's thesis is to characterize hash functions, describing their basic properties and uses. The next task was to focus on one hash function in particular, MD5, and describe it thoroughly: its construction, its security and the possible attacks on it. The last task was to implement this function and its collisions. The introductory chapters give the basic definition of a hash function and its properties, mention methods for preventing collisions, and survey the areas where hash functions are used. Further chapters focus on the characteristics of various types of hash functions, including basic hash functions built on elementary bit operations, perfect hash functions and cryptographic hash functions. After covering the characteristics of hash functions, the thesis turns to practical matters. It describes the basic appearance and operation of the accompanying program and its individual functions, which are explained theoretically. The following text describes the MD5 function, its construction, its security risks and its implementation. The last chapter deals with attacks on hash functions and describes the tunneling method for hash functions, the brute-force attack and the dictionary attack.
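The dictionary attack mentioned above is easy to illustrate against unsalted MD5. A minimal Python sketch (the wordlist and target password are invented for illustration; this is not the thesis's implementation):

```python
import hashlib

# Hash a message with MD5: the 128-bit digest is rendered as 32 hex characters.
digest = hashlib.md5(b"hello world").hexdigest()
print(digest)  # 32 hex characters = 16 bytes

# A toy dictionary attack: recover a password from its unsalted MD5 hash
# by hashing every candidate in a (hypothetical) wordlist and comparing.
def dictionary_attack(target_hex, wordlist):
    for word in wordlist:
        if hashlib.md5(word.encode()).hexdigest() == target_hex:
            return word
    return None

target = hashlib.md5(b"secret").hexdigest()
print(dictionary_attack(target, ["admin", "password", "secret"]))  # secret
```

A brute-force attack is the same loop over all strings up to a given length instead of a wordlist; both are feasible against MD5-hashed passwords precisely because the function is fast and unsalted hashes are directly comparable.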
APA, Harvard, Vancouver, ISO, and other styles
23

Beran, Martin. "Elektronická podatelna VUT 2." Master's thesis, Vysoké učení technické v Brně. Fakulta informačních technologií, 2007. http://www.nusl.cz/ntk/nusl-412777.

Full text
Abstract:
This master's thesis deals with the electronic registry (filing office) for VUT. It explains the principle on which an electronic registry operates and electronic signatures, and it compares the commercial registries on offer. It then covers the design and implementation of the electronic registry for VUT. Since the use of e-registries was made legal for all public-service offices, citizens can avoid long queues and clerks are spared the stress of deadline rushes. In communication through an electronic registry, the electronic signature is essential: it is very nearly a full-fledged, legally valid alternative to a handwritten signature. For safety and utility, the system employs asymmetric ciphers and hash algorithms. At present, in many countries where the electronic signature is recognized by law, it is used together with the X.509 standard, which defines the format of certificates and the organization and operation of certification authorities. A certification authority guarantees the secure binding between a person and the public key used for the electronic signature.
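The hash-then-sign principle with asymmetric keys described above can be illustrated with textbook RSA over an MD5 digest. The primes below are tiny demonstration values and the scheme omits padding, so this is insecure by construction and is not the registry's actual implementation.

```python
import hashlib

# Textbook RSA with tiny demonstration primes (insecure; illustration only).
p, q = 61, 53
n = p * q          # modulus 3233
e = 17             # public exponent
d = 2753           # private exponent: (e * d) % ((p - 1) * (q - 1)) == 1

def sign(message: bytes) -> int:
    # Hash-then-sign: reduce the MD5 digest mod n, apply the private key.
    h = int.from_bytes(hashlib.md5(message).digest(), "big") % n
    return pow(h, d, n)

def verify(message: bytes, signature: int) -> bool:
    # Recompute the hash and check it against the "decrypted" signature.
    h = int.from_bytes(hashlib.md5(message).digest(), "big") % n
    return pow(signature, e, n) == h

sig = sign(b"document")
print(verify(b"document", sig))  # True
print(verify(b"tampered", sig))  # a different message almost surely fails
```

Real electronic-signature systems use the same structure but with large keys, standardized padding (e.g. PSS), a collision-resistant hash instead of MD5, and X.509 certificates to bind the public key to a person.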
APA, Harvard, Vancouver, ISO, and other styles
24

Belix, José Eduardo. "Um estudo sobre MDA: suporte fornecido pela UML e reuso de soluções pré-definidas." Universidade de São Paulo, 2006. http://www.teses.usp.br/teses/disponiveis/3/3141/tde-11052006-165548/.

Full text
Abstract:
The goal of this work is to propose the use of pre-defined, proven solutions in the MDA approach, supporting the developer in solving recurrent software-development problems. Using these pre-defined solutions increases productivity in the MDA context and leads to the generation of software based on best practices. To reach this goal, an analysis of MDA is undertaken, along with analyses of how to operationalize the transformations between models, of the support provided by UML, and of reuse in model-driven development. Finally, this work presents portions of a prototype application, built as a proof of concept of code generated by combining UML with the pre-defined solutions.
APA, Harvard, Vancouver, ISO, and other styles
25

Kukačka, Pavel. "Master Data Management a jeho využití v praxi." Master's thesis, Vysoká škola ekonomická v Praze, 2011. http://www.nusl.cz/ntk/nusl-73506.

Full text
Abstract:
This thesis deals with Master Data Management (MDM), specifically its implementation. The main objectives are to analyze and capture the general approaches to MDM implementation, including best practices; to describe and evaluate an MDM project implemented in the Czech environment using Microsoft SQL Server 2008 R2 Master Data Services (MDS); and, on the basis of this theoretical background, the experience from the implemented project and the available technical literature, to create a general procedure for implementing the MDS tool. To achieve these objectives, the following procedures are used: exploration of information resources (printed, electronic, and personal appointments with consultants of Clever Decision), cooperation on a project realized by Clever Decision, and analysis of the Microsoft SQL Server 2008 R2 Master Data Services tool. The contributions of this work largely mirror its objectives, the main one being the general procedure for implementing the MDS tool. The thesis is divided into two parts. The first, theoretically oriented part deals with basic concepts (including delimitation against other systems), architecture, implementation styles, market trends and best practices. The second, practically oriented part first deals with the implementation of the realized MDS project and then describes a general procedure for implementing the MDS tool.
APA, Harvard, Vancouver, ISO, and other styles
26

Jonsson, Erik, and Mikael Persson. "MDP for Symbian." Thesis, Linköping University, Department of Computer and Information Science, 2008. http://urn.kb.se/resolve?urn=urn:nbn:se:liu:diva-14703.

Full text
Abstract:

The content of this report describes a bachelor thesis performed on commission by Cybercom Sweden West. The report describes the techniques, methods and development tools used during the project. The purpose of the project is to demonstrate the usefulness of a new Bluetooth profile called the Medical Device Profile (MDP) at the Medica trade fair on 14-11-2007. The MDP profile, to be released around the turn of the year 07/08, is primarily intended for use in medical devices. The demonstration is to be made by building a demonstrator consisting of an application running on a smartphone with Bluetooth capabilities. The application handles the Bluetooth connection between the smartphone and the oximeter, data encryption and other functionality, and presents the figures received from the oximeter in a Graphical User Interface (GUI). The final demonstrator consists of a smartphone application programmed in Symbian C++, which communicates with the oximeter using the Bluetooth Serial Port Profile (SPP). The application runs on UIQ 3.0 based smartphones and displays heart rate and %SpO2 (the percentage of oxygen saturation in blood), which the application receives from the oximeter. The original idea was to use the MDP profile by porting Cybercom's C version of MDP to Symbian OS for the Bluetooth communication; for various reasons described in more detail in the report, this was not done. The purpose of the project was still achieved, even though a profile other than MDP was used. This was done by replacing the intended oximeter with an older version that uses SPP. Using SPP demonstrates the same result with an older technique, achieving the same effect at the Medica trade fair. The project was demonstrated at Medica with a successful result.

APA, Harvard, Vancouver, ISO, and other styles
27

Chaves, Rafael Alves. "Aspectos e MDA." Florianópolis, SC, 2004. http://repositorio.ufsc.br/xmlui/handle/123456789/87201.

Full text
Abstract:
Master's dissertation - Universidade Federal de Santa Catarina, Centro Tecnológico, Programa de Pós-Graduação em Ciência da Computação.
Made available in DSpace on 2012-10-21T16:17:57Z (GMT). No. of bitstreams: 1 235612.pdf: 591706 bytes, checksum: 504520bc0ea1832a0e9fadc7b3c69fed (MD5)
The main contributions of this work consist in analyzing the potential of the combined use of the MDA approach and aspect orientation, and in proposing extensions to UML to support the creation of executable models using the aspect paradigm.
APA, Harvard, Vancouver, ISO, and other styles
28

Wendelin, Emma, and Emil Hallberg. "Omplacering av MDU." Thesis, Tekniska Högskolan, Högskolan i Jönköping, JTH, Maskinteknik, 2016. http://urn.kb.se/resolve?urn=urn:nbn:se:hj:diva-30758.

Full text
Abstract:
This report describes the work of a bachelor thesis with a major in product development at Saab. The aim of the thesis is to improve one of Saab's existing products called MIT, short for "Moving Infantry Target". MIT is used all around the world in harsh environments and is held to high standards. MIT consists of another Saab product, SIT, short for "Stationary Infantry Target", mounted on a trolley. SIT is a standard Saab product but has to be modified before being mounted on the MIT. The modification consists of the MDU, short for "Motor Driving Unit", a circuit board for controlling the electric engines located on the MIT trolley. The thesis aims to relocate the MDU from inside the SIT to the MIT trolley while still fulfilling the requirements of the MIT. One important requirement is that all products must have IP classification 67; IP classification is a standard measure of how water- and dust-proof a product is. To fulfill the requirements, a protective box for the MDU was developed. During the process, several questions were in focus: Where should the box be located? Which manufacturing methods and materials should be used? How should the cable enter the box while maintaining IP classification 67? These questions formed the basis for developing the concepts. Two concepts were created in total: the first concerns the location of the MDU and how to attach it with the protective box; the second concerns how the lid is attached and the cable lead-through. A 3D model was created from the winning concepts, and from that model a prototype was manufactured with a 3D printer.
APA, Harvard, Vancouver, ISO, and other styles
29

Schmid, Christian, and Yigal Gerchak. "How should a principal reward and support agents when firm performance is characterized by success or failure?" John Wiley & Sons, Ltd, 2019. http://dx.doi.org/10.1002/mde.3006.

Full text
Abstract:
Principal-agent models with multiple agents typically assume that the principal wishes to maximize the sum of the agents' achievements (net of the rewards paid to them). But in many settings, like R&D, all that the principal needs is that at least one agent be successful. We identify settings where the principal actually wants agents to refrain from exerting high effort in order to save expected compensation. We show that the number of agents can decrease in the project's value for the principal. We also consider sequential efforts and investigate settings where the principal can provide support to agents.
APA, Harvard, Vancouver, ISO, and other styles
30

Willden, Greg C., Ray D. Seegmiller, Maria S. Araujo, Ben A. Abbott, and William A. Malatests. "Vendor Interoperability Through MDL." International Foundation for Telemetering, 2011. http://hdl.handle.net/10150/595668.

Full text
Abstract:
ITC/USA 2011 Conference Proceedings / The Forty-Seventh Annual International Telemetering Conference and Technical Exhibition / October 24-27, 2011 / Bally's Las Vegas, Las Vegas, Nevada
Describing data formats has gone a long way in providing a common thread for moving test programs from one test range to another without incurring massive code rewrites. The introduction of the IRIG 106-93 standard provided the Telemetry Attributes Transfer Standard (TMATS) to achieve interoperability between the test article and ground processing system. The integrated Network Enhanced Telemetry (iNET) Metadata Description Language (MDL) extends the concept to include descriptions of the equipment configuration and setup. This MDL declarative language is both vendor neutral and vendor customizable (where needed) and extends interoperability down to the individual components of the instrumentation system. This paper describes the current state of MDL and its use across intended vendor lines.
APA, Harvard, Vancouver, ISO, and other styles
31

Čvančarová, Lenka. "MDM of Product Data." Master's thesis, Vysoká škola ekonomická v Praze, 2012. http://www.nusl.cz/ntk/nusl-150246.

Full text
Abstract:
This thesis is focused on Master Data Management (MDM) of product data. At present, most publications on the topic of MDM take into account customer data, and a very limited number of sources focus solely on product data; even those publications that do attempt to cover MDM in full depth are typically very customer-oriented. The lack of literature oriented toward Product MDM became one of the motivations for this thesis. Another motivation was to outline and analyze the specifics of Product MDM in the context of its implementation and the software requirements it places on a vendor of MDM application software. For this, I chose to create and describe a methodology for implementing MDM of product data. The methodology was derived from personal experience on projects focused on MDM of customer data, applied to the findings of the theoretical part of this thesis. By analyzing product data characteristics and their impact on MDM implementation, as well as their requirements for application software, this thesis helps vendors of Customer MDM understand the challenges of Product MDM and thus embark on the product data MDM domain. Moreover, this thesis can also serve as an information resource for enterprises considering adopting MDM of product data into their infrastructure.
APA, Harvard, Vancouver, ISO, and other styles
32

Arndt, Bruno Felipe. "MME-MDD : um método para manutenção e evolução de sistemas baseados no MDD." Universidade Federal de São Carlos, 2016. https://repositorio.ufscar.br/handle/ufscar/8503.

Full text
Abstract:
Submitted by Alison Vanceto (alison-vanceto@hotmail.com) on 2017-02-07T10:21:15Z No. of bitstreams: 1 DissBFA.pdf: 12687971 bytes, checksum: 19789fb95e5c01987f5067e083d04248 (MD5)
Made available in DSpace on 2017-02-08T12:09:45Z (GMT). No. of bitstreams: 1 DissBFA.pdf: 12687971 bytes, checksum: 19789fb95e5c01987f5067e083d04248 (MD5) Previous issue date: 2016-03-08
Coordenação de Aperfeiçoamento de Pessoal de Nível Superior (CAPES)
Model-Driven Development (MDD) aims to reduce the semantic gap between the problem and solution/implementation domains. Several tools are used to this end, the code generator being the most common in this context. These generators are often implemented using templates, and a Reference Implementation (RI) is usually employed to facilitate this type of implementation, favoring the development and maintenance of the software. However, an RI requires a code migration process, which consists of synchronizing the RI source code with the templates and is responsible for 20% to 25% of the time spent on development. The literature describes no automated solution, but the research group this work belongs to has developed tools that automate this process and reduce the time spent on some tasks. Each task, however, performs differently with respect to time spent, so automating some tasks can be disadvantageous, and few reports describe these tasks or which of them are amenable to automatic code migration. The aim of this study is to investigate the maintenance and evolution process in order to identify and describe the essential types of maintenance and evolution tasks. Based on this study, a method (MME-MDD) was devised that guides the developer through each type of task during the maintenance and evolution of systems, with the aim of maximizing the benefits of the approach. MME-MDD was validated through a case study and an empirical study, and the method proved effective in most of the tasks. In addition, the studies show that using the proposed method improved the quality of the source code.
O MDD (Model-Driven Development) tem como proposta a redução da distância semântica entre os domínios problema e solução/implementação. Para isso, são utilizadas algumas ferramentas, sendo o gerador de código comumente usado neste contexto. Os geradores de código são frequentemente implementados com a utilização de templates. Para facilitar este tipo de implementação, usualmente é empregado uma Implementação de Referência (IR), favorecendo a evolução/ manutenção do software. Contudo, a IR traz a necessidade do processo de migração de código, que consiste na sincronização entre o seu código-fonte e os templates, sendo que este é responsável por 20 a 25% do tempo gasto no desenvolvimento. Na literatura não há relatos de solução automatizada, mas o grupo no qual esta pesquisa se insere vem desenvolvendo ferramentas que automatizam este processo e reduzem o tempo aplicado em algumas tarefas testadas. Porém, cada tarefa tem um desempenho diferente em relação ao tempo gasto e, portanto, a automação de algumas tarefas pode ser desvantajosa. Existem poucos relatos na literatura descrevendo tais tarefas e quais são os passos necessários para realizá-las. O objetivo deste estudo foi a investigação do processo de manutenção e evolução de sistemas baseados em MDD com a finalidade de identificar e descrever os tipos de tarefas de manutenção e evolução. Com base nesse estudo, foi elaborado o método MMEMDD que conduz o desenvolvedor durante o processo de manutenção e evolução de sistemas, visando guiar o desenvolvedor durante a realização de cada um dos tipos de tarefas, com o intuito de maximizar os benefícios da utilização dessa abordagem. O método foi validado por meio de um estudo de caso e um estudo experimental, sendo que o método se mostrou efetivo em grande parte das tarefas testadas. Além disso, os estudos apontaram que a utilização do método proposto trouxe ganho na qualidade do código-fonte.
APA, Harvard, Vancouver, ISO, and other styles
33

Matos, João Marcos Duarte. "MetaSketch OCL Interpreter." Master's thesis, Universidade da Madeira, 2008. http://hdl.handle.net/10400.13/114.

Full text
Abstract:
In the context of the technologies proposed by the OMG, MOF is used to define the syntax of modeling languages; however, semantic aspects cannot be captured using this language. The description of non-syntactic aspects is performed using the OCL language. Consequently, a complete definition of a modeling language requires incorporating OCL into MOF, creating an infrastructure with the expressiveness needed for this role. This project aims to complement the MetaSketch Editor metamodeling tool by introducing the ability to execute OCL expressions, thus allowing semantic verification of models built with the MetaSketch Editor. The grammar of the adopted OCL language follows the specification produced by the OMG (2006-05-01), together with some contributions from existing work on this language. The project involved the implementation of a parser using the GOLD Parser system, the implementation of the OCL standard library in C#, and, finally, the implementation of a strategy for executing OCL expressions.
Supervisor: Leonel Domingos Telo Nóbrega
APA, Harvard, Vancouver, ISO, and other styles
34

Xiong, Haiyan. "Providing a formal linkage between MDG and HOL based on a verified MDG system." Thesis, Middlesex University, 2002. http://eprints.mdx.ac.uk/6731/.

Full text
Abstract:
Formal verification techniques can be classified into two categories: deductive theorem proving and symbolic state enumeration. The two methods have complementary advantages and disadvantages. In general, theorem provers are high-reliability systems. They can be applied to expressive formalisms capable of modelling complex designs such as processors. However, theorem provers use a glass-box approach: to complete a verification, it is necessary to understand the internal structure in detail. The learning curve is very steep, and modelling and verifying a system is very time-consuming. In contrast, symbolic state enumeration tools use a black-box approach: when verifying a design, the user does not need to understand its internal structure. Their advantages are speed and ease of use, but they can only be used to prove relatively simple designs, and their security is much lower than that of theorem proving systems. Many hybrid tools have been developed to reap the benefits of both theorem proving systems and symbolic state enumeration systems. Normally, the verification results from one system are translated to the other; in other words, there is a linkage between the two systems. However, how can we ensure that this linkage can be trusted? How can we ensure that the verification system itself is correct? The contribution of this thesis is a methodology that provides a formal linkage between a symbolic state enumeration system and a theorem proving system, based on a verified symbolic state enumeration system. The methodology has been partly realized in two simplified versions of the MDG system (a symbolic state enumeration system) and the HOL system (a theorem proving system), and involves the following three steps. First, we have verified aspects of the correctness of two simplified versions of the MDG system, making certain that the semantics of a program are preserved in those of its translated form. Secondly, we have provided a formal linkage between the MDG system and the HOL system based on importing theorems: the MDG verification results can be formally imported into HOL to form HOL theorems. Thirdly, we have combined the translator correctness theorems with the importing theorems. This combination allows low-level MDG verification results to be imported into HOL in terms of the semantics of a high-level language (MDG-HDL). We have also summarized a general method used to prove the existential theorem for the specification and implementation of a design. The feasibility of this approach has been demonstrated in a case study: the verification of the correctness and usability theorems of a vending machine.
APA, Harvard, Vancouver, ISO, and other styles
35

BARBOSA, Paulo Eduardo e. Silva. "MDA-VERITAS: uma arquitetura MDA estendida para transformações de sistemas concorrentes preservadoras de semântica." Universidade Federal de Campina Grande, 2011. http://dspace.sti.ufcg.edu.br:8080/jspui/handle/riufcg/1764.

Full text
Abstract:
Submitted by Johnny Rodrigues (johnnyrodrigues@ufcg.edu.br) on 2018-09-20T18:52:30Z No. of bitstreams: 1 PAULO EDUARDO E SILVA BARBOSA - TESE PPGCC 2011..pdf: 8460190 bytes, checksum: 711c8b40aaed80c81ec520880038d9b8 (MD5)
Made available in DSpace on 2018-09-20T18:52:30Z (GMT). No. of bitstreams: 1 PAULO EDUARDO E SILVA BARBOSA - TESE PPGCC 2011..pdf: 8460190 bytes, checksum: 711c8b40aaed80c81ec520880038d9b8 (MD5) Previous issue date: 2011-09-08
MDA é uma tendência de desenvolvimento de software que visa alterar o foco e os esforços dos modelos de desenvolvimento atuais. O método de implementação deixa de ser apenas a produção e código, e passa a também envolver modelos, metamodelos e transformações. Atualmente, essa abordagem tem sido diversificada com a inclusão de novos paradigmas que vão bem além do uso exclusivo dos padrões da OMG, como proposto originalmente. Contudo, a arquitetura MDA ainda sofre com a falta de formalização de alguns de seus artefatos e processos, levando a vários tipos de questionamentos. Um exemplo pertinente de questionamento se dá sobre o alto grau de ambigüidade dos modelos e transformações, originando problemas de baixa confiabilidade. Uma das conseqüências disso é o fato de que atualmente não existe uma maneira de garantir que transformações MDA sejam preservadoras de semântica, e nem que seus modelos envolvidos nas transformações sejam formais o suficiente para se permitir o uso de técnicas deverificação de equivalência, gerando críticas sobre a eficácia dessa abordagem. Esta tese de doutorado propõe lidar com esse problema, incorporando abordagens consolidadas de métodos formais na arquitetura MDA, tendo como contexto específico o desenvolvimento de software para sistemas embarcados com características de concorrência. Propomos extensões para parte da arquitetura MDA para que se possa construir modelos semânticos que representem aspectos estáticos e dinâmicos, ambos essenciais na semântica dos modelos envolvidos nas transformações e nos mecanismos de verificação de equivalência desses modelos. Com isso,obtemos a verificação de equivalência em transformações envolvendo modelos de sistemas concorrentes. Como avaliação do trabalho, provas de conceito, estudos de caso e avaliação experimental seguindo a abordagem GQM, envolvendo parcerias na academia e na indústria através de sistemas reais, foram implementados e avaliados. 
Verificamos equivalência entre modelos ao nível de transformações PIM-para-PIM, PSM-para-PSM e PIMpara-PSM como modelos de sistemas concorrentes descritos em redes de Petri e algumas de suas extensões.
MDA is a software development trend that aims to shift the focus and efforts of current development methodologies. The implementation method changes from code production alone to the use of models, metamodels and transformations. Currently, this approach has been diversified with the inclusion of new paradigms that go beyond the exclusive use of the MDA standards, as originally proposed. However, the MDA architecture still suffers from a lack of formalization of its artifacts and processes, leading to several sorts of questions. An important example is the high level of ambiguity of models and transformations, which causes problems of low reliability. One of the main consequences of this problem is that there is still no way to ensure that MDA transformations are semantics-preserving, nor that the models involved are formal enough to allow the use of equivalence verification techniques, which calls into question the effectiveness of the approach. This thesis proposes to deal with this problem by incorporating well-consolidated formal methods techniques into the MDA architecture, in the specific context of software development for embedded systems with concurrent features. We propose extensions to part of the MDA architecture in order to construct semantic models representing static and dynamic aspects, both essential to the semantics of the models involved in the transformations and in the equivalence verification mechanisms for these models. With this, we achieve equivalence verification in transformations involving models of concurrent systems. To evaluate the work, proofs of concept, case studies and an experimental evaluation following the GQM approach, involving partners in academia and industry through real systems, were implemented and evaluated. We verify model equivalence at the level of PIM-to-PIM, PSM-to-PSM and PIM-to-PSM transformations with models of concurrent systems described in Petri nets and some of their extensions.
APA, Harvard, Vancouver, ISO, and other styles
36

Dias, Michele Carrett. "Mecanismo de ação do ácido acetilsalicílico em linhagens celulares leucêmicas MDR e não MDR." reponame:Repositório Institucional da FURG, 2007. http://repositorio.furg.br/handle/1/229.

Full text
Abstract:
Master's dissertation - Universidade Federal do Rio Grande, Programa de Pós-Graduação em Ciências Fisiológicas – Fisiologia Animal Comparada, Instituto de Ciências Biológicas, 2007.
Submitted by Barbara Milbrath (barbaramilbrath@yahoo.com.br) on 2010-10-21T22:25:55Z No. of bitstreams: 1 tese michele carrett dias.pdf: 355404 bytes, checksum: 04e1becf311e23c5da6492386ffd6c33 (MD5)
Made available in DSpace on 2010-11-08T16:17:01Z (GMT). No. of bitstreams: 1 tese michele carrett dias.pdf: 355404 bytes, checksum: 04e1becf311e23c5da6492386ffd6c33 (MD5) Previous issue date: 2007
Cancer statistics are merciless: one in five people will develop some form of cancer at some point in life, and it is worth noting that malignant tumors have also been observed in plants and in other animals. Besides the conventional tools for cancer treatment, which include radiotherapy, chemotherapy and surgery, other alternative therapies have been proposed, such as photodynamic therapy. Another promising attempt to fight cancer has been demonstrated with the use of acetylsalicylic acid (ASA). ASA, the most important salicylate of the family of non-steroidal anti-inflammatory drugs (NSAIDs), gained popularity in 1899, when its anti-inflammatory properties were recognized. Experimental data suggest that ASA and other members of the NSAID family inhibit the growth of cancer cells in vitro and in vivo. Prostaglandins are credited with the power to initiate and promote cancer by causing cell proliferation, inhibiting apoptosis (programmed cell death), stimulating angiogenesis or suppressing the immune response. Inhibiting the Cox enzyme is related to inhibiting prostaglandin production, thus suggesting inhibition of the carcinogenic process. ASA irreversibly inhibits Cox in certain cell types, and this inhibition is non-selective for both Cox isoforms: Cox-1 (the constitutive isoform) and Cox-2 (the induced isoform). However, studies suggest that the antiproliferative effect of ASA is not correlated exclusively with inhibition of Cox, since there are reports showing that NSAIDs can induce apoptosis in colon cancer cells that do not express the Cox-2 protein.
In this regard, some authors have demonstrated a dose-dependent in vitro growth inhibition of human endometrial tumor cells by ASA, with apoptosis being one of the mechanisms involved in this response, mediated in part by downregulation of the bcl-2 gene. A reduction in the number of apoptotic events contributes to cancer development, and bcl-2 is the first member of a family of genes that regulate this process. Overexpression of bcl-2 has been shown to increase the survival of tumor cells, protecting them from the toxicity caused by chemotherapeutic agents by preventing apoptosis. On the other hand, induction of apoptosis is one of the central actions through which the P53 protein exerts its tumor-suppressing function. This protein prevents the transmission of defective genetic information to subsequent cell generations, earning it the name "guardian of the genome", and loss of this function is a frequent finding in cancer. Mutation of the p53 gene probably inactivates the suppressor function of the P53 protein, conferring a cell growth advantage and possibly contributing to the development of tumors, among them leukemias. The antioxidant properties of NSAIDs have also been investigated, and some authors attribute the antitumor effects of ASA to them as well. It is also extremely relevant to consider the possibility that certain tumor cells may acquire resistance to multiple drugs, characterizing the MDR phenotype. Currently, the search for new drugs capable of overcoming the MDR mechanism and leading tumor cells to death is of utmost importance for cancer therapy. Thus, creating a biological model that allows comparative studies between an MDR and a non-MDR tumor cell line is pertinent.
Based on the information gathered on the possible antitumor activity of ASA, we set out to analyze, as study parameters, its cytotoxicity (in tumor and non-tumor cells), cell death, antioxidant activity, and changes in the expression of the cox-2, bcl-2 and p53 genes, using normal cell lines and MDR and non-MDR tumor cell lines as biological models. ASA inhibited cell proliferation or induced toxicity in the K562 and Lucena cell lines regardless of the MDR phenotype. Treatment with ASA caused death mainly by early apoptosis in K562 cells and by necrosis in Lucena cells. ASA also showed antioxidant capacity in both lines. Expression of the bcl-2 gene showed no significant differences between control and ASA-treated cells or between the two cell lines. For the p53 and cox-2 genes, expression was concentration-dependent in K562 cells, whereas in Lucena cells the expression of both genes was increased at the lower concentrations and, for p53, decreased at the highest concentration compared with control cells. Since the expression profiles of p53 and cox-2 were similar, a common transcription factor could be suggested to account for this response. In contrast, normal lymphocytes treated with the same ASA concentrations were more resistant than the tumor lines. The results of this work showed that both cell lines were sensitive to ASA treatment, but suggest that the mechanism of action differed between the MDR and non-MDR lines.
APA, Harvard, Vancouver, ISO, and other styles
37

Zheng, Zhaohua. "Intracellular delivery of MDR drugs." Thesis, University of Manchester, 2008. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.492718.

Full text
Abstract:
One of the most important mechanisms underlying the multidrug resistance (MDR) phenotype is believed to be P-gp-mediated drug efflux. The consequent lowering of the intracellular concentrations of many commonly used chemotherapeutic drugs, such as doxorubicin (Dox), has been addressed by P-gp inhibition via covalent attachment of MDR drugs to carriers. This project aimed to develop new drug-binding peptide structures able to traverse cell membranes and to investigate their potential in the 'non-covalent' delivery of Dox in drug-resistant (KD30) cells.
APA, Harvard, Vancouver, ISO, and other styles
38

Allem, Luiz Emílio. "Polinômios multivariados : fatoração e MDC." reponame:Biblioteca Digital de Teses e Dissertações da UFRGS, 2010. http://hdl.handle.net/10183/27080.

Full text
Abstract:
Nesta tese de doutorado estudamos polinômios multivariados. Começamos fazendo uma revisão bibliográfica sobre o teorema da irredutibilidade de Hilbert. Abordamos com detalhes as demonstrações da versão clássica feita pelo próprio Hilbert e das versões efetivas feitas por Erich Kaltofen e Shuhong Gao. Desenvolvemos um novo algoritmo para fatoração de polinômios multivariados inteiros usando logaritmo discreto. Nosso método é baseado em novos tipos de reduções de polinômios multivariados para polinômios bivariados, as quais têm como principal característica manter a esparsidade do polinômio. Nosso método mostrou-se eficiente quando usado para fatorar polinômios multivariados que possuem apenas fatores esparsos e quando usado para extrair fatores esparsos de polinômios multivariados que têm fatores esparsos e densos. Terminamos essa tese trabalhando com o máximo divisor comum (mdc) de polinômios. Estudamos critérios geométricos de politopos para determinar coprimalidade entre polinômios multivariados. Desenvolvemos um novo algoritmo que trabalha em tempo polinomial (sobre o número de monômios) para detectar coprimalidade entre polinômios multivariados usando seus politopos de Newton associados. Esse método geométrico tem a vantagem de determinar a coprimalidade entre famílias de polinômios, pois podemos mudar arbitrariamente os coeficientes dos polinômios desde que certos coeficientes permaneçam não nulos. Além disso, os polinômios permanecerão coprimos sobre qualquer corpo. Terminamos mostrando como construir o mdc entre dois polinômios bivariados usando seus polígonos de Newton associados.
In this dissertation we study multivariate polynomials. We begin with a bibliographical review of the Hilbert irreducibility theorem, covering in detail the proofs of the classic version due to Hilbert himself and of the effective versions due to Erich Kaltofen and Shuhong Gao. We developed a new algorithm for factoring multivariate integral polynomials using discrete logarithms. Our method is based on new types of reductions, from multivariate polynomials to bivariate polynomials, whose main feature is to maintain the sparsity of the polynomial. Our method has proved to be efficient when used to factor multivariate polynomials that have only sparse factors and when used to extract sparse factors from multivariate polynomials that have both sparse and dense factors. We finish this dissertation by studying the greatest common divisor (gcd) of polynomials. We study geometric criteria on polytopes to determine coprimality between multivariate polynomials, and we developed a new algorithm that works in polynomial time (in the number of monomials) to detect coprimality between multivariate polynomials using their associated Newton polytopes. This geometric method has the advantage of determining coprimality between families of polynomials, since we can arbitrarily change the polynomial coefficients as long as some coefficients remain nonzero; moreover, the polynomials remain coprime over any field. We end by showing how to build the gcd of two bivariate polynomials using their associated Newton polygons.
APA, Harvard, Vancouver, ISO, and other styles
39

Soares, Inali Wisniewski. "PM-MDA: um método para o desenvolvimento de modelos de plataforma no contexto da MDA." Universidade Tecnológica Federal do Paraná, 2012. http://repositorio.utfpr.edu.br/jspui/handle/1/716.

Full text
Abstract:
Esta tese propõe um método denominado PM-MDA para o desenvolvimento de Modelos de Plataforma (Platform Model - PM) no contexto da abordagem Model Driven Architecture (MDA). O método PM-MDA tem como foco o desenvolvimento de projetos de Software embarcado baseados em Sistemas Operacionais em Tempo Real (Real-Time Operating System - RTOS). Adicionalmente, este estudo define um perfil UML 2.0 para modelagem da aplicação e plataforma de software embarcado denominado Profile for modeling Application and Platform of Embedded Software (PROAPES) que é usado no método PM-MDA. Tal perfil define um conjunto de estereótipos para descrever genericamente Modelos de Plataforma e Modelos Independentes de Plataforma (Platform Independent Model - PIM). Além disso, são definidas extensões desse perfil, tal como o perfil PROAPESX que permite a modelagem de PMs para versões do RTOS X Real-Time Kernel e hardware associados. Além disso, o perfil PROAPES possibilita vincular um PIM a um PM, permitindo que esses modelos sejam inseridos como atributos de entrada em uma Transformação de Modelos. No contexto da MDA, esse perfil constitui-se em um metamodelo de plataforma (um metamodelo de uma família de plataformas similares) para a construção de modelos de plataforma. Desse modo, um PM é usado como parte fundamental para o desenvolvimento de software embarcado na abordagem MDA, fornecendo meios de obter independência de plataforma. Em abordagens atuais de MDA, as transformações de modelos empregam implicitamente os modelos de plataforma. Como os interesses referentes à plataforma não são separados dos interesses referentes às transformações de modelos, para cada plataforma requerida deve existir uma ou mais transformações de modelos correspondentes que são configuradas especificamente para aquela plataforma. O resultado são processos de transformações de modelos difíceis de serem automatizados. 
No domínio de sistemas embarcados, o uso de MDA é ainda mais importante devido à heterogeneidade de plataformas e à complexidade destes sistemas. O método PM-MDA, que faz uso do perfil PROAPES, visa sistematizar o processo de criação e disponibilização de modelos de plataforma separados do processo de transformação de modelos, possibilitando a geração de processos de transformações de modelos eficientes e adaptáveis.
This thesis proposes a method called PM-MDA for the development of Platform Models (PMs) in the context of Model Driven Architecture (MDA). The PM-MDA method focuses on the development of embedded-software projects based on Real-Time Operating Systems (RTOS). Additionally, this study defines a UML 2.0 Profile for Modeling Application and Platform of Embedded Software (PROAPES), which is used in the PM-MDA method. This profile defines a set of stereotypes to generically describe Platform Models and Platform Independent Models (PIMs). Extensions of the profile are also defined, e.g. the PROAPESX profile, which allows the modeling of PMs for versions of the X Real-Time Kernel RTOS and its associated hardware. In turn, the PROAPES profile makes it possible to link a PIM to a PM, allowing these models to be passed as input to a model transformation. In the context of MDA, this profile constitutes a platform metamodel for building PMs, i.e., a metamodel of a family of similar platforms. In this way, a PM is used as a fundamental part of embedded-software development in the MDA approach, providing a means of obtaining platform independence. In current MDA approaches, model transformations employ platform models only implicitly. Since platform concerns are not separated from model-transformation concerns, for each required platform there must be one or more corresponding model transformations configured specifically for that platform. This results in model transformation processes that are expensive and difficult to automate. In application domains such as embedded systems, the use of MDA is all the more compelling because of the heterogeneity of platforms and the complexity of these systems. The PM-MDA method, which makes use of the PROAPES profile, aims to systematize the process of creating and providing platform models separately from the model-transformation process, enabling the generation of efficient and adaptable model transformations.
APA, Harvard, Vancouver, ISO, and other styles
40

Krawatzeck, Robert. "Softwaretests in der Domäne modellgetriebener BI-Systeme." Universitätsbibliothek Chemnitz, 2011. http://nbn-resolving.de/urn:nbn:de:bsz:ch1-qucosa-70716.

Full text
Abstract:
Companies today operate in a highly dynamic environment, so the requirements placed on Business Intelligence (BI) systems change constantly. Companies that can react quickly to these changes gain competitive advantages. The adaptability this demands of BI systems presupposes that the BI architecture itself is designed for flexibility; applying the paradigm of model-driven software development to the domain of data warehouse engineering (DWE) meets this need. Beyond the agility of BI systems, however, the verification of correctness after changes have been made must also be considered. This thesis investigates to what extent the metadata arising in model-driven DWE can be used to support and automate software tests for correctness checking. The resulting reduction in verification effort improves the adaptability of BI architectures and thus meets the demand for efficient, agile BI solutions.
APA, Harvard, Vancouver, ISO, and other styles
41

Araújo, Priscila Câmara de. "Uso da modelagem digital de terreno e de superfície para a estimativa do potencial de verticalização na região do campo de marte (SP)." reponame:Repositório Institucional da UnB, 2015. http://dx.doi.org/10.26512/2015.D.19831.

Full text
Abstract:
Dissertação (mestrado)—Universidade de Brasília, Instituto de Ciências Humanas, Departamento de Geografia, Programa de Pós-Graduação em Geografia, 2015.
Submitted by Fernanda Percia França (fernandafranca@bce.unb.br) on 2016-03-31T16:36:20Z No. of bitstreams: 1 2015_PriscilaCâmaradeAraújo.pdf: 3124057 bytes, checksum: 2b221e051c7eb29ebbdd1ca4d015077d (MD5)
O planejamento da ordenação territorial é imprescindível para o crescimento das cidades. Os Planos Diretores têm papel fundamental no estabelecimento de diretrizes para ocupação dos espaços. A implantação do Trem de Alta Velocidade brasileiro aumenta as possibilidades de relações entre centros urbanos e impacta a ordenação do território, especialmente as áreas de implantação das estações. Nesse sentido, a modelagem digital torna-se uma ferramenta importante no planejamento da ocupação dos terrenos, tendo em vista que os Modelos Digitais de Terreno (MDT) e os Modelos Digitais de Superfície (MDS) fornecem informações do terreno e dos elementos sobre a superfície. Para o desenvolvimento deste trabalho, foi escolhida a região do Campo de Marte (SP), uma das regiões previstas para implantação de uma estação do TAV Brasil, com o objetivo de testar uma metodologia de trabalho que auxilie na estimativa das potencialidades de verticalização em torno da estação, modelando uma simulação de mudança de gabarito na área. Utilizando ortofotos e o MDS produzidos em 2010/2011 pela Empresa Paulista de Planejamento Metropolitano (Emplasa) e disponibilizados pela Empresa de Planejamento e Logística (EPL) do governo federal, foi feito o mapeamento das ruas e edificações da área de influência indireta do Campo de Marte e a construção do MDT. Foi calculado o volume de ocupação atual da área pela diferença dos valores de elevação obtidos pelo MDS e MDT dentro da área da estrutura predial, multiplicada pela área desta estrutura. Os resultados obtidos foram a classificação das edificações atuais e a verificação de que a área de estudo ainda possui um grande potencial de verticalização permitido pelo plano diretor (cerca de 5 milhões de metros cúbicos).
The growth of cities is dynamic and constant, and planning of the territorial order is essential. Directive plans play a fundamental role in establishing guidelines and strategies for the occupation of space. The implementation of the Brazilian high-speed train (TAV Brasil) increases the possibilities for relations between urban centers and impacts land-use planning, especially in the areas where stations will be built. In this sense, digital modeling becomes an important planning tool, given that Digital Terrain Models (MDT) and Digital Surface Models (MDS) provide information about the terrain and about the elements on its surface. For this work, the region of Campo de Marte (SP), one of the areas proposed for a TAV Brasil station, was chosen in order to test a methodology for estimating the potential for vertical expansion around the station, modeling a simulated change of height limits in the area. Using orthophotos and the MDS produced in 2010/2011 by the Empresa Paulista de Planejamento Metropolitano (Emplasa) and made available by the federal government's Empresa de Planejamento e Logística (EPL), the streets and buildings in the area of indirect influence of Campo de Marte were mapped and the MDT was constructed. The current occupied volume of the area was calculated as the difference between the elevation values given by the MDS and the MDT within each building footprint, multiplied by the footprint area. The results were a classification of the current buildings and the verification that the study area still has a large potential for vertical expansion under the directive plan (about 5 million cubic meters); when a height of 87 meters was simulated, this value tripled.
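The volume calculation described in this abstract (summing MDS minus MDT elevation differences inside each building footprint and multiplying by cell area) can be sketched on toy rasters; this is a simplified illustration, not the author's actual GIS workflow.

```python
import numpy as np

def building_volume(mds, mdt, footprint_mask, cell_area=1.0):
    """Occupied volume: sum of (surface - terrain) heights inside the
    building footprint, times the ground area of each raster cell."""
    heights = np.where(footprint_mask, mds - mdt, 0.0)
    # Negative differences (noise where surface dips below terrain) are clipped.
    return float(np.clip(heights, 0.0, None).sum() * cell_area)

# Toy 2x2 raster: one building cell 10 m tall over a 1 m^2 cell.
mds = np.array([[15.0, 5.0], [5.0, 5.0]])   # digital surface model
mdt = np.array([[5.0, 5.0], [5.0, 5.0]])    # digital terrain model
mask = np.array([[True, False], [False, False]])
print(building_volume(mds, mdt, mask))  # 10.0 cubic meters
```

Scaled up over all mapped footprints, this is the kind of computation that yields the aggregate figures (millions of cubic meters) reported in the study.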
APA, Harvard, Vancouver, ISO, and other styles
42

Anger, Sabrina. "Potential and challenges of compound semiconductor characterization by application of non-contacting characterization techniques." Doctoral thesis, Technische Universitaet Bergakademie Freiberg Universitaetsbibliothek "Georgius Agricola", 2016. http://nbn-resolving.de/urn:nbn:de:bsz:105-qucosa-201487.

Full text
Abstract:
Trotz der im Vergleich zu Silizium überragenden elektronischen Eigenschaften von Verbindungshalbleitern, ist die Leistung der daraus gefertigten elektrischen Bauelemente aufgrund der vorhandenen, die elektronischen Materialeigenschaften beeinflussenden Defekte nach wie vor begrenzt. Die vorliegende Arbeit trägt dazu bei, das bestehende ökonomische Interesse an einem besseren Verständnis der die Bauelementeleistung limitierenden Defekte zu befriedigen, indem sie die Auswirkungen dieser Defekte auf die elektronischen und optischen Materialeigenschaften von Indiumphosphid (InP) und Siliziumkarbid (SiC) aufzeigt. Zur Klärung der Effekte finden in der Arbeit sich ergänzende elektrische und optische Charakterisierungsmethoden Anwendung, von denen die meisten kontaktlos und zerstörungsfrei arbeiten und sich daher prinzipiell auch für Routineanalysen eignen. Die erzielten Ergebnisse bestätigen und ergänzen Literaturdaten zum Defektinventar in InP und SiC nutzbringend. So wird insbesondere das Potential der elektrischen Charakterisierung mittels MDP und MD-PICTS, welche in der Arbeit erstmals für die Defektcharakterisierung von InP und SiC eingesetzt wurden, nachgewiesen. Die experimentellen Studien werden dabei bedarfsorientiert durch eine theoretische Betrachtung des entsprechenden Signalentstehungsmechanismuses ergänzt
Although the electronic properties of compound semiconductors exceed those of silicon, the performance of the corresponding electronic devices is still limited. This is due to the presence of various growth-induced defects in compound semiconductors. In order to satisfy the economic demand for better insight into these limiting defects, this thesis contributes to a better understanding of material-inherent defects in the commonly used Indium Phosphide (InP) and Silicon Carbide (SiC) by revealing their effects on electronic and optical material properties. To that end, various complementary electrical and optical characterization techniques were applied to both materials. Most of these techniques are non-contacting and non-destructive, so in principle they are suitable for routine application. The characterization results obtained with these techniques either confirm published results concerning defects in InP and SiC or usefully complement them. In particular, the potential of electrical characterization by MDP and MD-PICTS measurements is demonstrated; both techniques were applied for the first time to the defect characterization of InP and SiC in these studies. The experiments are complemented by a theoretical consideration of the corresponding signal-formation mechanism, in order to develop an explanation for occasionally occurring experimental imperfections, which also arise from time to time in silicon characterization.
APA, Harvard, Vancouver, ISO, and other styles
43

Becker, Steffen. "Coupled model transformations for QoS enabled component-based software design." Karlsruhe Univ.-Verl. Karlsruhe, 2008. http://d-nb.info/990667650/04.

Full text
APA, Harvard, Vancouver, ISO, and other styles
44

Halvarsson, Sören. "Manufacture of straw MDF and fibreboards." Doctoral thesis, Mittuniversitetet, Institutionen för naturvetenskap, teknik och matematik, 2010. http://urn.kb.se/resolve?urn=urn:nbn:se:miun:diva-11732.

Full text
Abstract:
The purpose of this thesis was to develop an economical, sustainable, and environmentally friendly straw Medium Density Fibreboard (MDF) process, capable of full-scale manufacturing and of producing MDF of the requested quality. The straw investigated was wheat (Triticum aestivum L.) and rice (Oryza sativa L.) straw. Three different methods were used to manufacture straw MDF: (A) wheat-straw fibre was blowline-blended with melamine-modified urea-formaldehyde (MUF) resin; (B) rice-straw fibre was mixed with methylene diphenyl diisocyanate (MDI) in a resin drum blender; and (C) wheat-straw fibre was activated in the blowline by the addition of Fenton's reagent (H2O2/Fe2+) to produce resin-free MDF panels. The MUF/wheat-straw MDF panels met the requirements of the European standard for MDF (EN 622-5, 2006). The MDI/rice-straw MDF panels met the requirements of the American National Standards Institute standard for MDF (ANSI A208.2-2002). The resin-free wheat-straw panels showed mediocre properties and did not meet the requirements of the MDF standard. The dry process for wood-based MDF was modified for the production of straw MDF and divided into seven main process steps:

1. Size reduction (hammer-milling) and screening of straw
2. Wetting and heating of straw
3. Defibration
4. Resination of straw fibre
5. Mat forming
6. Pre-pressing
7. Hot-pressing

The primary result was that the straw MDF process is capable of producing satisfactory straw MDF panels from different straw species and adhesives. Moreover, the process was run at pilot-plant scale and shown to be a suitable method for producing straw MDF all the way from straw bales to finished panels.
From an environmental perspective, agricultural straw waste is a suitable raw material for MDF: it avoids open-field burning and stores carbon dioxide (CO2) in the panels as a biological sink for an extended time, instead of the straw being converted directly into bioenergy or reused only a few times as recycled paper. In addition, straw MDF panels can be recycled or converted to energy after use. A relationship was established between the water retention value (WRV) of resinated straw fibres, the thickness swelling (TS) of the corresponding straw MDF panels, and the amount of adhesive applied. The WRV of the straw fibre increased, and the TS of the straw MDF declined, as a function of resin content. The empirical models developed were of acceptable significance, with R2 values of 0.69 (WRV) and 0.75 (TS), respectively. That the thickness swelling of MDF decreases as resin content increases is well known; the increase of WRV as a function of added polymers is not yet fully established within the science of fibre swelling. This opens the way for more fundamental research, and a simple method for predicting the thickness swelling of MDF from analysis of the dried, resinated MDF fibres appears feasible.
Syftet med denna avhandling var att lägga grunden för en ekonomisk, hållbar och miljövänlig MDF process för halmråvara, kapabel för fullskalig produktion av MDF och goda skivegenskaper. Framställningen av MDF skivor utgick från halm av vete (Triticum aestivum L.) och ris (Oryzae sativa L.). Tre olika metoder användes för att producera MDF av halm; (A) fibrer av vetehalm belimmades i blåsledning med ett melaminmodifierat urea-formaldehydlim (MUF), (B) fibrer av rishalm belimmades i en limblandare med metylen difenyl diisocyanate (MDI), (C) Limfria MDF skivor av vetehalm framställdes med aktivering av fibrer genom tillsats av Fenton´s reagens (H2O2/Fe2+) i blåsledning utan någon tillsats av syntetiskt lim. Sammanfattningsvis kan det understrykas att framställda MDF-skivor av MUF/vetehalm var godkända enligt standard för MDF (EN 622-5, 2006). Dessutom var framställda MDF skivor av MDI/rishalm också godkända enligt krav i standard för MDF ”American National Standard Institute” (ANSI A2008.2-2002). Limfria vetehalmskivor visade på måttliga skivegenskaper och klarade inte kraven i MDF standard.   Fiberframställningsprocessen för MDF modifierades till en produktion utgående från halm. MDF processen för halm delades upp i sju primära processoperationer.   (1)            Storleksreducering och sållning av halm (2)            Vätning och uppvärmning av halm (3)            Defibrering (4)            Belimning av halmfiber (5)            Mattformning (6)            Förpressning (7)            Pressning   De viktigaste resultaten från denna studie är att MDF av halm kunde produceras utgående från olika typer av halmsorter och lim. Dessutom utfördes MDF-processen i pilotskala och visade på en lämplig metod för framställning av MDF-skivor från halmbalar till färdiga halmfiberskivor. 
Det miljömässiga perspektivet på att använda jordbruksavfall till framställning av halmskivor är att undvika förbränning av halm ute på fältet, men det är även möjligt att binda koldioxid (CO2) i halmskivor under längre tid än att omsätta halmråvaran omedelbart som bioenergi eller använda halmfiber som returpapper några få gånger. Dessutom kan MDF återanvändas eller bli omsatt till energi efter användning.   Ett förhållande mellan ”water retention value” (WRV), av belimmade halmfiber, tjocklekssvällning för motsvarande MDF av halmskivor och mängden av tillsatt lim vid olika nivåer har undersökts. Med ökande limhalt tilltog WRV fibersvällning, vidare minskade tjocklekssvällning för motsvarande MDF skivor. De framtagna empiriska modellerna var godtagbara och beräknade R2 värden var 0.69 (WRV) och 0.75 (TS). Minskad tjocklekssvällning med ökad limhalt är dokumenterad sen tidigare. Ökad fibersvällning WRV vid tillsats av polymerer (limmer) är inte fullständigt etablerad inom vetenskapen för fibersvällning. Lyckligtvis kan grundläggande forskning initieras och sannolikt föreligger en enkel metod för att prediktera tjocklekssvällning av MDF genom analyser av torkade och belimmad MDF fiber.
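The empirical models mentioned in this abstract are simple regressions, and the R2 statistic they report can be computed as below. The numbers here are made-up toy values purely to illustrate the calculation; they are not measurements from the thesis.

```python
import numpy as np

def fit_line_r2(x, y):
    """Ordinary least-squares line y = a*x + b and its coefficient of
    determination R^2 (1 - SS_res / SS_tot)."""
    a, b = np.polyfit(x, y, 1)
    pred = a * x + b
    ss_res = np.sum((y - pred) ** 2)
    ss_tot = np.sum((y - np.mean(y)) ** 2)
    return a, b, 1.0 - ss_res / ss_tot

# Hypothetical data: thickness swelling (TS) falling as resin content rises.
resin = np.array([8.0, 10.0, 12.0, 14.0])   # % resin content
ts = np.array([14.0, 11.5, 10.0, 8.0])      # % thickness swelling
a, b, r2 = fit_line_r2(resin, ts)
print(f"slope={a:.3f}, intercept={b:.3f}, R2={r2:.3f}")
```

An R2 of 0.75, as reported for the TS model, means the fitted line explains three quarters of the variance in the measured thickness swelling.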
APA, Harvard, Vancouver, ISO, and other styles
45

Škultinas, Tomas. "MDA panaudojimo programinės įrangos kūrimui tyrimas." Master's thesis, Lithuanian Academic Libraries Network (LABT), 2005. http://vddb.library.lt/obj/LT-eLABa-0001:E.02~2005~D_20050524_163850-89727.

Full text
Abstract:
The IT industry is constantly looking for ways to improve software development productivity as well as the quality and longevity of the software it creates. The OMG has announced Model Driven Architecture (MDA) as its strategic direction: a software development methodology that provides a new viewpoint on the software development process. The modeling of the problem domain and model transformation are the key elements of the MDA architecture, and they are analyzed in this work using OMG specifications and other resources. The purpose of this work is to evaluate the benefits of an MDA framework in the software development process. A new MDA framework is developed according to the results of the MDA architecture analysis. Experimental use of the new framework concentrates on the productivity of the software development process, the automation of repeated tasks, and the skill set required of application developers.
APA, Harvard, Vancouver, ISO, and other styles
46

Northcott, Christopher Barry. "The development of MI5 1909-1918." Thesis, University of Bedfordshire, 2005. http://hdl.handle.net/10547/346882.

Full text
Abstract:
The 1909-1918 era can be regarded as the formative years of MI5, as it developed from a small counter-espionage bureau into an established security intelligence agency. MI5 had two main roles during this period: counter-espionage, and advising the War Office on how to deal with the police and the civilian population, particularly aliens. Most of the existing literature tends to focus on the development of MI5 as a whole and pays little attention to the six individual branches that constituted MI5 by the armistice. Recently released MI5 documents in The National Archives (TNA) make it possible to examine MI5 at the micro level and set out the intimate workings of its six branches. The study examines the evolution of MI5 from its formation in October 1909 to the end of the First World War in November 1918, paying particular attention to three questions. First, what did a map of the structure of the MI5 organisation look like and "how" did it develop during these years? Secondly, "why" did it develop as it did? Thirdly, "how effective" was MI5 throughout this period? MI5 began as a one-man affair in 1909, tasked with the limited remit of ascertaining the extent of German espionage in Britain, and with an uncertain future. By the armistice MI5's role had expanded considerably and it had begun to develop into an established security intelligence agency, with 844 personnel spread over six branches covering the investigation of espionage, prevention, records, ports and travellers, overseas, and alien workers. This study suggests that the main driver of these developments, if one key factor can be singled out, was the changing perception of the nature of the threat posed by German espionage.
However, because some within official circles equated all forms of opposition to Government policy with support for Germany, increasing attention also began to be paid to the possibility that industrial unrest, pacifists and others who opposed the Government might actually be being directed by a German "hidden hand". From 1917 onwards MI5's development was driven by a conviction that it had defeated German espionage, and that Germany had therefore switched its efforts to promoting Bolshevism and other forms of unrest in order to undermine British society. However, MI5's activities were restricted to investigating whether there was really any enemy influence behind such things, while Special Branch was to focus on labour unrest generally. This study makes an original and useful contribution to knowledge in three noteworthy respects. First, it sets out probably the most detailed description of MI5's organisational structure available. Secondly, it poses the stimulating question of "how to measure" the effectiveness of a counter-espionage agency. Thirdly, it suggests that, contrary to claims that Vernon Kell was an "empire builder" who wanted a greater role in labour intelligence, Kell felt it appropriate that MI5's activities should be restricted to the investigation of cases of peace propaganda and sedition that arose from enemy activities, and did not actually want MI5 to assume a broader role in labour intelligence at that time.
APA, Harvard, Vancouver, ISO, and other styles
47

Moore, Jessica D., and Richard D. Stiers. "iNET MDL from a User Perspective." International Foundation for Telemetering, 2014. http://hdl.handle.net/10150/577441.

Full text
Abstract:
ITC/USA 2014 Conference Proceedings / The Fiftieth Annual International Telemetering Conference and Technical Exhibition / October 20-23, 2014 / Town and Country Resort & Convention Center, San Diego, CA
During concept development of a new core analog acquisition system, Boeing Flight Test identified a need for a set of more efficient and cost-effective test-system configuration and setup tools, preferably supported by an industry standard. Like most big test organizations, we support years and years of legacy tools, and currently all new functions are required to be hosted within the legacy environment. Legacy environments tend to be big, slow, and expensive to update and maintain. In searching for a better way to do business, we evaluated the iNET/MDL, IHAL, and XidML standards. For a variety of reasons which will be discussed in this paper, we have chosen to focus on the iNET MDL standard as the means for producing a new vendor-agnostic, simpler and more cost-effective system interface. Our initial evaluation uncovered several gaps in the data structure and concept of operations. The iNET community acknowledged the gaps and encouraged us to work with them to enhance the standard. The iNET MDL concept of operations also represents a significant operational paradigm shift. Through an industry users group, we have been working to refine and enhance the data structures and concept of operations. This paper will describe the journey from a demonstration environment to an enterprise implementation of MDL as it relates to data acquisition setup and control.
APA, Harvard, Vancouver, ISO, and other styles
48

Boni, Robson Aparecido dos Santos. "Regeneração nervosa periférica em camundongos mdx." [s.n.], 2010. http://repositorio.unicamp.br/jspui/handle/REPOSIP/317586.

Full text
Abstract:
Orientador: Humberto Santo Neto
Dissertação (mestrado) - Universidade Estadual de Campinas, Instituto de Biologia
Made available in DSpace on 2018-08-16T20:39:51Z (GMT). No. of bitstreams: 1 Boni_RobsonAparecidodosSantos_M.pdf: 2053465 bytes, checksum: ae91ac2421ee67ae72964df4c510de03 (MD5) Previous issue date: 2010
Resumo: A distrofina é uma proteína de membrana ligada ao citoesqueleto da matriz extracelular das fibras musculares (esqueléticas e cardíacas) e nervosas. Enquanto que o papel da distrofina e os efeitos de sua ausência são bem conhecidos nos camundongos mdx (modelo animal da distrofia muscular de Duchenne), sabe-se pouco sobre sua função em nervos periféricos. A distrofina parece ser importante para o crescimento axonal, no sistema trigeminal a sua falta leva a defasciculação do sistema olfatório de camundongos. Em casos de ausência de distrofina a eliminação sináptica ocorre precocemente à expressão de moléculas pré-sinapticas é reduzida e a habilidade das células de Schwann terminais de guiarem as fibras para reinervação muscular fica comprometida. Estes achados sugerem que a distrofina possui papel essencial na regeneração nervosa periférica. Para testar esta hipótese nós examinamos a regeneração nervosa em camundongos mdx. Foram utilizados camundongos adultos machos da linhagem mdx e camundongos da linhagem C57BL/10 como controle, eles foram anestesiados com mistura de cloridrato de cetamina e cloridrato de xilazina. O nervo isquiático direito foi exposto e esmagado com uso de uma pinça fina sem ranhuras. Após o evento cirúrgico e cessado o efeito do anestésico os animais foram acondicionados em gaiolas e submetidos a regime hídrico e alimentar "ad libitum" com ciclo fotoperiódico claro/escuro de 12 horas. Destes, um grupo foi tratado com injeções intraperitoneais de L-arginina (6mg/kg) diluído em água bidestilada. Seis e 21 dias após o esmagamento os animais foram anestesiados e perfundidos por via intracardíaca com solução de Karnovsky. O nervo isquiático foi removido e imerso em fixador por 24 horas e pós-fixado em tetróxido de ósmio 1% por 2 horas. Posteriormente foram inclusos em blocos e feitos cortes semifinos que foram corados com azul de toluidina 0,5%. As secções foram analisadas em fotomicroscópio NIKON ECLIPSE E-400 (NIKON, Inc.). 
A densidade dos axônios com mielina (6dias) e axônios em regeneração (21 dias) foram contados. Nossos resultados demonstraram que a densidade de axônios com mielina foi significantemente maior no mdx em comparação ao C57BL/10 (312±10,2/mm2 versus 213,8±4,6/mm2). A densidade de macrófagos e células de Schwann com restos de mielina foram respectivamente 77,6±5,6/mm2 e 148±2,4/mm2. Após 21 dias, todos os parâmetros (diâmetro do axônio, espessura da bainha de mielina e número de axônios regenerados) foram significantemente menores nos camundongos mdx. Nos camundongos tratados com L-arginina os parâmetros foram semelhantes ao controle não havendo diferença estatística. Os resultados mostraram que o papel da distrofina e do óxido nítrico são de fundamental importância na regeneração nervosa periférica.
Abstract: Dystrophin is a membrane protein that links the cytoskeleton to the extracellular matrix in skeletal and cardiac muscle fibers and in the nervous system. While the role of dystrophin is well established, and the effects of dystrophin loss, as it occurs in the mdx mouse model of Duchenne muscular dystrophy, have been widely examined in muscle fibers, less is known about dystrophin function in peripheral nerves. It seems to be important for axonal outgrowth in the trigeminal system, and the lack of dystrophin leads to nerve defasciculation in the mouse olfactory system. In the absence of dystrophin, synapse elimination occurs earlier, the expression of presynaptic molecules is reduced, and the ability of terminal Schwann cells to guide the reinnervation of muscle fibers is impaired. These findings suggest a potential role of dystrophin in the regeneration of peripheral nerves. To test this hypothesis we examined nerve regeneration in mdx mice. Adult male mdx and control C57BL/10 mice were anesthetized with a mixture of ketamine hydrochloride and xylazine hydrochloride. The right sciatic nerve was exposed at mid-thigh and crushed with fine forceps. The wound was closed and the mice were kept with food and water ad libitum in a 12-hour light-dark cycle. One group was treated with L-arginine (6 mg/kg) in drinking water. Six and 21 days after the nerve crush, the mice were anesthetized and perfused intracardially with Karnovsky solution. The sciatic nerves were excised, and fragments were immersed in the same fixative for 24 hours, post-fixed in 1% osmium tetroxide for 2 hours, and conventionally processed for electron microscopy. Transverse semithin sections were stained with 0.5% toluidine blue and viewed under a Nikon Eclipse E-400 (Nikon, Inc.) microscope. The densities of axons with myelin breakdown and of Schwann cells/macrophages filled with myelin debris (6 days), and of myelinated regenerating axons (3 weeks), were counted directly.
Our results demonstrated that the density of axons displaying myelin breakdown was significantly higher in crushed mdx than in crushed C57BL/10 nerves (312±10.2/mm2 versus 213.8±4.6/mm2); the densities of macrophages and Schwann cells with myelin debris were 77.6±5.6/mm2 and 148±2.4/mm2, respectively. After 21 days, all parameters (axonal diameter, myelin sheath thickness, number of regenerating axons) were significantly lower in mdx mice. When mdx mice were treated with L-arginine, these parameters were not significantly different from control. These results demonstrate that dystrophin plays a role in nerve regeneration and that nitric oxide may be an important factor in this process.
Master's thesis
Anatomy
Master in Cell and Structural Biology
APA, Harvard, Vancouver, ISO, and other styles
49

Leocadio, Marcelo Augusto. "Código MDS com a métrica POSET." Universidade Federal de Viçosa, 2013. http://locus.ufv.br/handle/123456789/4927.

Full text
Abstract:
Made available in DSpace on 2015-03-26T13:45:36Z (GMT). No. of bitstreams: 1 texto completo.pdf: 1755688 bytes, checksum: 33e268f82618cf29e2d1fa6df5c6fa6c (MD5) Previous issue date: 2013-07-30
Fundação de Amparo a Pesquisa do Estado de Minas Gerais
The poset metric is a generalization of the Hamming metric. In this work we present a detailed study of poset spaces, the hierarchy of I_P-weights, and the I_P-distribution of weights, with emphasis on non-degenerate poset codes. We verify the duality relation between the weight hierarchies of a poset code and its dual. We then define two new parameters for the class of poset codes that are non-degenerate and whose dual codes are also non-degenerate. As a consequence, we state and prove the Minimality Theorem, the Variance Theorem, and the Minimality Identity in poset spaces.
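The abstract above notes that the poset metric generalizes the Hamming metric: the poset weight of a vector is the size of the order ideal generated by its support, and when the partial order is an antichain this reduces to the ordinary Hamming weight. A minimal sketch of this idea (the function name `poset_weight` and the chain/antichain examples are ours, not from the thesis):

```python
def poset_weight(x, leq):
    """Poset weight of x: size of the order ideal generated by supp(x).

    `leq(i, j)` should return True when coordinate i <= j in the poset P
    on positions {0, ..., n-1}.
    """
    n = len(x)
    # Support: positions with nonzero entries.
    support = {i for i, xi in enumerate(x) if xi != 0}
    # Order ideal: every position below some support position.
    ideal = {i for i in range(n) for j in support if leq(i, j)}
    return len(ideal)

# Antichain order (i <= j only when i == j): recovers the Hamming weight.
antichain = lambda i, j: i == j
# Chain order 0 < 1 < ... < n-1: weight is 1 + the largest nonzero position.
chain = lambda i, j: i <= j

x = [0, 1, 0, 1]
print(poset_weight(x, antichain))  # 2 (Hamming weight)
print(poset_weight(x, chain))      # 4 (ideal generated by {1, 3} is {0, 1, 2, 3})
```

The induced poset distance is then d_P(x, y) = poset_weight(x - y, leq), just as the Hamming distance is the Hamming weight of the difference.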
APA, Harvard, Vancouver, ISO, and other styles
50

Michl, Zbyněk. "T-Mobile MDA II v Linuxu." Master's thesis, Vysoké učení technické v Brně. Fakulta informačních technologií, 2009. http://www.nusl.cz/ntk/nusl-236630.

Full text
Abstract:
This MSc. thesis deals with the T-Mobile MDA II mobile digital assistant running the Linux operating system. The first part presents device identification and the parameter specification of the MDA II. The second part focuses on the selection of a GNU distribution with a Linux bootloader and a comparison of Linux kernel support. The subject of the last part is the implementation of MDA II component code and its merging into the Linux kernel.
APA, Harvard, Vancouver, ISO, and other styles
