Dissertations / Theses on the topic 'Open tools'

Consult the top 50 dissertations / theses for your research on the topic 'Open tools.'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever it is available in the metadata.

Browse dissertations / theses in a wide variety of disciplines and organise your bibliography correctly.

1

Haustein, Mario. "Open-Source-Tools für Amateurastronomen." Universitätsbibliothek Chemnitz, 2009. http://nbn-resolving.de/urn:nbn:de:bsz:ch1-200901933.

Full text
Abstract:
As is well known, there is an open-source program for almost every conceivable task. With a few exceptions, however, the range of software available to amateur astronomers is rather hard to survey. This talk therefore gives an overview of various software projects that are useful for astrophotography and astronomical observation. Simple demonstrations illustrate how these programs can be used and are intended to encourage exploring the night sky with binoculars and a camera.
APA, Harvard, Vancouver, ISO, and other styles
2

Stull, Michael D. "Open source tools : applications and implications." Thesis, Monterey, Calif. : Springfield, Va. : Naval Postgraduate School ; Available from National Technical Information Service, 2000. http://handle.dtic.mil/100.2/ADA386593.

Full text
Abstract:
Thesis (M.S. in Defense Analysis (Information Operations)), Naval Postgraduate School, Dec. 2000. Thesis advisor(s): Arquilla, John; Shimeall, Timothy. "December 2000." Includes bibliographical references (p. 87-92). Also available in print.
APA, Harvard, Vancouver, ISO, and other styles
3

Zhang, Hailing. "Comparison of Open Source License Scanning Tools." Thesis, Linnéuniversitetet, Institutionen för datavetenskap och medieteknik (DM), 2020. http://urn.kb.se/resolve?urn=urn:nbn:se:lnu:diva-97921.

Full text
Abstract:
We aim to determine the features of four popular FOSS scanning tools, FOSSology, FOSSA, FOSSID (SCAS), and Black Duck, thereby providing references for users to choose a proper tool for performing open-source license compliance in their projects. The sanity tests firstly verify the license detection function by using the above tools to scan the same project. We consider the number of found licenses and scanned sizes as metrics of their accuracy. Then we generate testing samples in different programming languages and sizes for further comparing the scanning efficiency. The experiment data demonstrate that each tool would fit different user requirements. Thus this project could be considered as a definitive user guide.
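To give a feel for the metric compared above (the number of licenses found in a scanned project), the following minimal Python sketch simply counts SPDX-License-Identifier tags in a source tree. It is only an illustrative baseline for cross-checking scan reports; it does not reproduce how FOSSology, FOSSA, FOSSID (SCAS) or Black Duck detect licenses internally, and the scanned path is a hypothetical placeholder.

```python
# Minimal sketch: tally SPDX-License-Identifier tags in a project tree as a
# crude "number of found licenses" baseline when comparing scanner reports.
# This is NOT how the commercial or open-source scanners named above work.
import os
import re
from collections import Counter

SPDX_TAG = re.compile(r"SPDX-License-Identifier:\s*([\w.+-]+)")

def count_spdx_licenses(root: str) -> Counter:
    """Walk a source tree and count SPDX license identifiers found in files."""
    counts: Counter = Counter()
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            try:
                with open(path, "r", encoding="utf-8", errors="ignore") as fh:
                    for line in fh:
                        match = SPDX_TAG.search(line)
                        if match:
                            counts[match.group(1)] += 1
            except OSError:
                continue  # unreadable file: skip it
    return counts

if __name__ == "__main__":
    # Hypothetical usage: scan the current directory and print the tally.
    print(count_spdx_licenses("."))
```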
APA, Harvard, Vancouver, ISO, and other styles
4

Königsson, Niklas. "Limitations of static analysis tools : An evaluation of open source tools for C." Thesis, Umeå universitet, Institutionen för datavetenskap, 2018. http://urn.kb.se/resolve?urn=urn:nbn:se:umu:diva-155299.

Full text
Abstract:
This paper contains an evaluation of common open source static analysis tools available for C. The tools' algorithms are examined and measured in a test environment designed for such benchmarks, to present their strengths and weaknesses. The examined tools represent different approaches to static analysis to get a good coverage of the algorithms that are commonly used. The test environment shows how many bugs are correctly reported by the tools, and also how many falsely reported bugs they produce. The revealed strengths and weaknesses are discussed in relation to the tools' algorithms to gain a deeper understanding of their limitations.
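As an illustration of the bookkeeping such a benchmark implies (correctly reported bugs versus false alarms), here is a minimal Python sketch that scores a tool's warnings against a set of known bug locations. The file/line pairs are hypothetical examples, and this is not the thesis's actual test environment.

```python
# Minimal sketch: score a static analyzer's warnings against seeded bugs.
# True positives are correctly reported bugs, false positives are falsely
# reported bugs, false negatives are bugs the tool missed.
from dataclasses import dataclass

@dataclass(frozen=True)
class Finding:
    file: str
    line: int

def score(known_bugs: set, reported: set) -> dict:
    true_pos = len(known_bugs & reported)      # correctly reported bugs
    false_pos = len(reported - known_bugs)     # falsely reported bugs
    false_neg = len(known_bugs - reported)     # missed bugs
    precision = true_pos / (true_pos + false_pos) if reported else 0.0
    recall = true_pos / (true_pos + false_neg) if known_bugs else 0.0
    return {"tp": true_pos, "fp": false_pos, "fn": false_neg,
            "precision": precision, "recall": recall}

if __name__ == "__main__":
    # Hypothetical benchmark data: two seeded bugs, two tool warnings.
    known = {Finding("parser.c", 42), Finding("util.c", 7)}
    tool = {Finding("parser.c", 42), Finding("main.c", 99)}
    print(score(known, tool))  # 1 TP, 1 FP, 1 FN
```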
APA, Harvard, Vancouver, ISO, and other styles
5

Wiggins, John Sterling. "Design and specification of a PC-based, open architecture environment controller." Thesis, Georgia Institute of Technology, 2000. http://hdl.handle.net/1853/17299.

Full text
APA, Harvard, Vancouver, ISO, and other styles
6

Fridell, Emil. "Architectural Rules Conformance with ArCon and Open-Source Modeling Tools." Thesis, Linköpings universitet, Programvara och system, 2012. http://urn.kb.se/resolve?urn=urn:nbn:se:liu:diva-78888.

Full text
Abstract:
In software development it is often crucial that the system implementation follows the architecture defined through design patterns and a constraint set. In Model-Driven Development most artefacts are created using models, but architectural design rules are one area where no standard for modelling the rules exists. ArCon, Architecture Conformance Checker, is a tool to check conformance of architectural design rules on a system model, defined in UML, that implements the system or application. The architectural design rules are defined in a UML model but with a specific meaning, different from standard UML, proposed by the authors of ArCon. Within this thesis ArCon was extended to be able to check models created by the open-source modeling tool Papyrus, and integrated as a plugin on the Eclipse platform. The method used by ArCon to define architectural rules was also given a short evaluation during the project to get a hint of its potential and future use. The case study showed some problems and potential improvements of the used implementation of ArCon and its supported method.
APA, Harvard, Vancouver, ISO, and other styles
7

Teltz, Richard W. "Open architecture control for intelligent machining systems." Thesis, National Library of Canada = Bibliothèque nationale du Canada, 1998. http://www.collectionscanada.ca/obj/s4/f2/dsk1/tape11/PQDD_0006/NQ42883.pdf.

Full text
APA, Harvard, Vancouver, ISO, and other styles
8

Joshi, Hrishikesh S. "Open architecture for embedding VR based mechanical tools in CAD." Online access for everyone, 2006. http://www.dissertations.wsu.edu/Thesis/Fall2006/h_joshi_010507.pdf.

Full text
APA, Harvard, Vancouver, ISO, and other styles
9

Xia, Ziqi. "Comparative Study of Open-Source Performance Testing tools versus OMEXUS." Thesis, KTH, Skolan för elektroteknik och datavetenskap (EECS), 2021. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-302650.

Full text
Abstract:
With the development of service digitalization and the increased adoption of web services, modern large-scale software systems often need to support a large volume of concurrent transactions. Therefore, performance testing focused on evaluating the performance of systems under workload has gained greater attention in current software development. Although there are many performance testing tools available for providing assistance in load generation, there is a lack of a systematic evaluation process to provide guidance and parameters for tool selection for a specific domain. Focusing on business operations as the specific domain and the Nasdaq Central Securities Depository (NCSD) system as an example of large-scale software systems, this thesis explores opportunities and challenges of existing open-source performance testing tools as measured by usability and feasibility metrics. The thesis presents an approach to evaluate performance testing tools concerning requirements from the business domain and the system under test. This approach consists of a user study conducted with four quality assurance experts discussing general performance metrics and specific analytical needs. The outcome of the user study provided the assessment metrics for a comparative experimental evaluation of three open-source performance testing tools (JMeter, Locust, and Gatling) with a realistic test scenario. These three tools were evaluated in terms of their affordance and limitations in presenting analytical details of performance metrics, efficiency of load generation, and ability to implement realistic load models. The research shows that the user study with potential tool users provided a clear direction when evaluating the usability of the three tools. Additionally, the realistic test case was sufficient to reveal each tool's capability to achieve the same scale of performance as Nasdaq's in-house testing tool OMEXUS and provide additional value with realistic simulation of user population and user behavior during performance testing with regard to the specified requirements.
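Since Locust is one of the three tools evaluated, a minimal Locust load model may help illustrate what "implementing a realistic load model" looks like in practice. The endpoints and task weights below are hypothetical placeholders, not the actual NCSD/OMEXUS transaction mix studied in the thesis.

```python
# Minimal sketch of a Locust load model: simulated users pause between
# transactions and mix two request types with different weights.
# Endpoint paths and the payload are hypothetical placeholders.
from locust import HttpUser, task, between

class TradingUser(HttpUser):
    # Each simulated user waits 1-3 seconds between tasks, which gives a more
    # realistic arrival pattern than a constant request loop.
    wait_time = between(1, 3)

    @task(3)  # weighted: issued three times as often as the settlement task
    def query_positions(self):
        self.client.get("/api/positions")  # hypothetical endpoint

    @task(1)
    def submit_settlement(self):
        self.client.post("/api/settlements",  # hypothetical endpoint
                         json={"isin": "SE0000000000", "quantity": 100})
```

Such a file would typically be run with the locust command line against a test host, which spawns the simulated users and reports response-time and throughput statistics.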
APA, Harvard, Vancouver, ISO, and other styles
10

Marlow, Gregory. "Week 07, Video 02: Modeling Tools." Digital Commons @ East Tennessee State University, 2020. https://dc.etsu.edu/digital-animation-videos-oer/51.

Full text
APA, Harvard, Vancouver, ISO, and other styles
11

Maechel, Lars. "Development of a location-based taxi service : using open-source tools." Thesis, Mittuniversitetet, Avdelningen för data- och systemvetenskap, 2016. http://urn.kb.se/resolve?urn=urn:nbn:se:miun:diva-30228.

Full text
Abstract:
The aim of this project is to develop a component able to retrieve taxi providers in Sweden, based on input of coordinates and a search radius. The result should be available in a well-structured and accessible way through a RESTful web service. The study is conducted based on a customer inquiry stating that the component should be built using open-source tools and be developed in such a way that allows the component, or parts of the component, to be integrated in a larger system. Therefore, a preliminary study was conducted in order to find suitable open-source tools that are able to solve the specific customer requirements. The component uses an ad hoc company search engine to retrieve the taxi providers, contact information and coordinates. Additional information needed for determining the correctness of the taxi providers is retrieved from external resources, and a filtering process is then performed before they are saved in persistent store. The project is successful in solving the main purpose and most of the customer requirements, while the RESTful service is unable to fully meet the requirement stating that the component should be able to handle multiple concurrent clients while still maintaining responsiveness. This is mainly due to the decision not to use an ad hoc framework for handling geospatial data structures and calculations, and was a trade-off to ensure re-usability and integration of the component in a larger system.
APA, Harvard, Vancouver, ISO, and other styles
12

Motsumi, Tebagano Valerie. "Designing and prototyping WebRTC and IMS integration using open source tools." Thesis, Rhodes University, 2018. http://hdl.handle.net/10962/63245.

Full text
Abstract:
WebRTC, or Web Real-time Communications, is a collection of web standards that detail the mechanisms, architectures and protocols that work together to deliver real-time multimedia services to the web browser. It represents a significant shift from the historical approach of using browser plugins, which over time, have proven cumbersome and problematic. Furthermore, it adopts various Internet standards in areas such as identity management, peer-to-peer connectivity, data exchange and media encoding, to provide a system that is truly open and interoperable. Given that WebRTC enables the delivery of multimedia content to any Internet Protocol (IP)-enabled device capable of hosting a web browser, this technology could potentially be used and deployed over millions of smartphones, tablets and personal computers worldwide. This service and device convergence remains an important goal of telecommunication network operators who seek to enable it through a converged network that is based on the IP Multimedia Subsystem (IMS). IMS is an IP-based subsystem that sits at the core of a modern telecommunication network and acts as the main routing substrate for media services and applications such as those that WebRTC realises. The combination of WebRTC and IMS represents an attractive coupling, and as such, a protracted investigation could help to answer important questions around the technical challenges that are involved in their integration, and the merits of various design alternatives that present themselves. This thesis is the result of such an investigation and culminates in the presentation of a detailed architectural model that is validated with a prototypical implementation in an open source testbed. The model is built on six requirements which emerge from an analysis of the literature, including previous interventions in IMS networks and a key technical report on design alternatives. Furthermore, this thesis argues that the client architecture requires support for web-oriented signalling, identity and call handling techniques leading to a potential for IMS networks to natively support these techniques as operator networks continue to grow and develop. The proposed model advocates the use of SIP over WebSockets for signalling and DTLS-SRTP for media to enable one-to-one communication and can be extended through additional functions resulting in a modular architecture. The model was implemented using open source tools which were assembled to create an experimental network testbed, and tests were conducted demonstrating successful cross domain communications under various conditions. The thesis has a strong focus on enabling ordinary software developers to assemble a prototypical network such as the one that was assembled and aims to enable experimentation in application use cases for integrated environments.
APA, Harvard, Vancouver, ISO, and other styles
13

PALLONI, ROBERTO. "Open Data Analytics - Advanced methods, Tools and Visualizations for Policy Making." Doctoral thesis, Università Politecnica delle Marche, 2019. http://hdl.handle.net/11566/263468.

Full text
Abstract:
The research discussed in this thesis is focused on developing and applying new methodologies for collecting, processing and visualizing large sets of open data for public policy performance assessment and decision making. The research focuses on the effectiveness of ESIF and the use of other open data sources for data-driven decision making supporting public managers. While data analytics represents the research topic, public policy open data is the application domain. Beyond the problem of transparency and accountability to citizens, accessible and usable public data have great informative value for public administration decision making. Open sources of data allow administrators, researchers and practitioners to develop new analyses and visualizations that can unleash the hidden informative potential of data with meaningful insights. Data management and performance monitoring are core to business intelligence (BI) informing and supporting decision making, not only for private enterprises. For this reason, recent years have seen increasing numbers of open data initiatives, public database diffusion, open data hackathons and data-related initiatives. Furthermore, the pari passu diffusion of recent technologies and data sharing systems such as application program interfaces (APIs) is also boosting the diffusion and use of public policy data. Based on this framework, using open data the research attempts to address the following two hypotheses. H1: Open data platforms can be considered useful for policy making and not just as data tombs set up only to satisfy governmental digital agenda requirements. H2: In allocating research and innovation investments, regions have developed their S3s according to embeddedness, relatedness and connectivity. For this reason, this document has two main parts. The first gives a comprehensive overview of the use of open data at European Union (EU) level for monitoring and performance assessment; using ESIF open data, the research focuses on developing a wider and deeper approach to the use of open data for simpler and more effective interpretation and insights. The second part uses additional open data sources (public documents, RegPat patents and CORDIS projects) to assist strategy development and assessment, highlighting possible uses of open data to promote data-driven policy making. The first part explains a web tool dedicated to visualizing ESIF open data that improves on current methodologies and tools to expose insights, based on theoretical principles for simpler and more effective interpretation and on the adoption of advanced technologies for a simple and flexible solution. The second part includes an analytical framework with empirical results, based on the capacity of Italian regions to effectively allocate crucial regional investments in public and private R&D according to the European innovation policy (Smart Specialisation Strategy). Despite the wider diffusion and increase of open data availability and of more powerful technologies to exploit their potential, these resources present many issues (Schintler, 2014). Beyond the volume and the complexity of the available data, the velocity and veracity (uncertainty) of the data sources affect their quality, accuracy and completeness. ESIF open public information still lacks sufficient granularity and completeness to represent a full informative asset. This creates a blurred lens problem, with reduced informative power for the data available. For example, microdata on individual projects financed by ESIF are reported by only a few Member States, despite being the most detailed source of information at the deepest level possible for the problem under analysis. Moreover, using open data often implies adopting only an approximation of the information needed. As with regional investments for innovation policies, patent and research project data are used as a proxy for innovation potential, though most enterprise innovation remains untracked, especially in many Italian regions. Different and more specific data should be collected and assessed to improve decision making regarding such topics. This problem goes beyond the scope of this document.
APA, Harvard, Vancouver, ISO, and other styles
14

Mayne, Matthew. "Development of new software tools for phase equilibria modelling of open systems." Thesis, Lyon, 2018. http://www.theses.fr/2018LYSES038/document.

Full text
Abstract:
The investigation of metamorphic processes in the Earth's crust is integral to understanding the formation and evolution of the Earth. These processes control the preservation potential of the geochronological rock record and give us insight into, amongst others, the pressure and temperature conditions of the Earth's interior. Further, they control fluid generation and consumption within the crust, which influences global geochemical cycles within the lithosphere, hydrosphere and atmosphere. This has important implications for the global climate and the creation of conditions conducive to life. The dominant mechanisms of change both within and between these systems are compositional changes invoked by processes of mass transfer. Modern quantitative phase equilibrium modelling allows the calculation of the stable phase assemblage of a rock system at equilibrium given its pressure, temperature and bulk chemical composition. However, current software programs have limited functionalities for the sophisticated handling of a changing bulk composition. A new software tool (Rcrust) has been developed that allows the modelling of points in pressure–temperature–bulk composition space in which bulk compositional changes can be passed between points as the system evolves.
APA, Harvard, Vancouver, ISO, and other styles
15

Marlow, Gregory. "Week 01, Video 03: Maya UI Manipulation Tools." Digital Commons @ East Tennessee State University, 2020. https://dc.etsu.edu/digital-animation-videos-oer/8.

Full text
APA, Harvard, Vancouver, ISO, and other styles
16

Štolc, Robin. "Porovnání komerčních a open source nástrojů pro testování softwaru." Master's thesis, Vysoká škola ekonomická v Praze, 2010. http://www.nusl.cz/ntk/nusl-75963.

Full text
Abstract:
The subject of this thesis is software testing tools, specifically tools for manual testing, automatic testing, bug tracking and test management. The aim of this thesis is to introduce the reader to several testing tools from each category and to compare these tools. This objective relates to the secondary aim of creating a set of criteria for testing tool comparison. The contribution of this thesis is the description and comparison of the chosen testing tools and the creation of a set of criteria that can be used to compare any other testing tools. The thesis is divided into four main parts. The first part briefly describes the theoretical foundations of software testing, the second part deals with descriptions of various categories of testing tools and their role in the testing process, the third part defines the method of comparison and the comparison criteria, and in the last, fourth, part the selected testing tools are described and compared.
APA, Harvard, Vancouver, ISO, and other styles
17

Luccioni, Carlo. "Open Participation - How online tools could foster user engagement aimed to city development." Thesis, Malmö högskola, Fakulteten för kultur och samhälle (KS), 2013. http://urn.kb.se/resolve?urn=urn:nbn:se:mau:diva-22555.

Full text
Abstract:
The thesis research question originated from an interest in experiments with web and social media tools, and explores in what ways these kinds of communicative and collaborative media could facilitate opening dialogue among citizens and the various actors present in the area of Malmö. The research approach deemed most suitable for the selected area is a combination of a participatory design methodology and ethnographic research. During the fieldwork phase, different use situations were investigated through interviews, surveys and case studies. The purpose of the investigation was mapping the different stakeholders who work to solve these problems by engaging the residents. The modalities of communication of Malmö municipality and its non-profit organizations have been analysed, and the results were compared with other research in the field. To understand the motives behind the users' behaviour, the reasons for active and non-active participation have been investigated, especially those related to social media. Between passive and active engagement a different form of engagement has been identified, one that could include the mixed user group of individuals who are interested in being gradually involved in volunteering; having different levels of indirect engagement can help these users take the first step towards participating. The concept was developed as a Facebook app in collaboration with Frivilligcentra. The app allows users to define their own engagement path, dividing every local volunteering event into tasks with different levels of involvement. Overall it may define a new flexibility in the non-profit sector (in terms of time, place, diversification of the experience and engagement), which could demonstrate that, with the support of online tools, organizations and users can interact better.
APA, Harvard, Vancouver, ISO, and other styles
18

Silva, Vítor Emanuel Marta da. "Development of open models and tools for seismic risk assessment: application to Portugal." Doctoral thesis, Universidade de Aveiro, 2013. http://hdl.handle.net/10773/11948.

Full text
Abstract:
The exponential growth of the world population has led to an increase of settlements often located in areas prone to natural disasters, including earthquakes. Consequently, despite the important advances in the field of natural catastrophes modelling and risk mitigation actions, the overall human losses have continued to increase and unprecedented economic losses have been registered. In the research work presented herein, various areas of earthquake engineering and seismology are thoroughly investigated, and a case study application for mainland Portugal is performed. Seismic risk assessment is a critical link in the reduction of casualties and damages due to earthquakes. Recognition of this relation has led to a rapid rise in demand for accurate, reliable and flexible numerical tools and software. In the present work, an open-source platform for seismic hazard and risk assessment is developed. This software is capable of computing the distribution of losses or damage for an earthquake scenario (deterministic event-based) or earthquake losses due to all the possible seismic events that might occur within a region for a given interval of time (probabilistic event-based). This effort has been developed following an open and transparent philosophy and therefore, it is available to any individual or institution. The estimation of the seismic risk depends mainly on three components: seismic hazard, exposure and vulnerability. The latter component assumes special importance, as by intervening with appropriate retrofitting solutions, it may be possible to decrease directly the seismic risk. The employment of analytical methodologies is fundamental in the assessment of structural vulnerability, particularly in regions where post-earthquake building damage might not be available. Several common methodologies are investigated, and conclusions are yielded regarding the method that can provide an optimal balance between accuracy and computational effort. In addition, a simplified approach based on the displacement-based earthquake loss assessment (DBELA) is proposed, which allows for the rapid estimation of fragility curves, considering a wide spectrum of uncertainties. A novel vulnerability model for the reinforced concrete building stock in Portugal is proposed in this work, using statistical information collected from hundreds of real buildings. An analytical approach based on nonlinear time history analysis is adopted and the impact of a set of key parameters investigated, including the damage state criteria and the chosen intensity measure type. A comprehensive review of previous studies that contributed to the understanding of the seismic hazard and risk for Portugal is presented. An existing seismic source model was employed with recently proposed attenuation models to calculate probabilistic seismic hazard throughout the territory. The latter results are combined with information from the 2011 Building Census and the aforementioned vulnerability model to estimate economic loss maps for a return period of 475 years. These losses are disaggregated across the different building typologies and conclusions are yielded regarding the type of construction more vulnerable to seismic activity.
APA, Harvard, Vancouver, ISO, and other styles
19

Euler, Elias. "Perspectives on the role of digital tools in students' open-ended physics inquiry." Licentiate thesis, Uppsala universitet, Fysikundervisningens didaktik, 2019. http://urn.kb.se/resolve?urn=urn:nbn:se:uu:diva-382750.

Full text
Abstract:
In this licentiate thesis, I present detailed case studies of students as they make use of simulated digital learning environments to engage with physics phenomena. In doing so, I reveal the moment-to-moment minutiae of physics students’ open-ended inquiry in the presence of two digital tools, namely the sandbox software Algodoo and the PhET simulation My Solar System (both running on an interactive whiteboard). As this is a topic which has yet to receive significant attention in the physics education research community, I employ an interpretivist, case-oriented methodology to illustrate, build, and refine several theoretical perspectives. Notably, I combine the notion of semi-formalisms with the notion of Newtonian modeling, I illustrate how Algodoo can be seen to function as a Papertian microworld, I meaningfully combine the theoretical perspectives of social semiotics and embodied cognition into a single analytic lens, and I reveal the need for a more nuanced taxonomy of students’ embodiment during physics learning activities. Each of the case studies presented in this thesis makes use of conversation analysis in a fine-grained examination of video-recorded, small-group student interactions. Of particular importance to this process is my attention to students’ non-verbal communication via gestures, gaze, body position, haptic-touch, and interactions with the environment. In this way, I bring into focus the multimodally-rich, often informal interactions of students as they deal with physics content. I make visible the ways in which the students (1) make the conceptual connection between the physical world and the formal/mathematical domain of disciplinary physics, (2) make informal and creative use of mathematical representations, and (3) incorporate their bodies to mechanistically reason about physical phenomena. Across each of the cases presented in this thesis, I show how, while using open-ended software on an interactive whiteboard, students can communicate and reason about physics phenomena in unexpectedly fruitful ways.
APA, Harvard, Vancouver, ISO, and other styles
20

Gatt, Michael. "Tools for understanding electroacoustic music." Thesis, De Montfort University, 2014. http://hdl.handle.net/2086/10754.

Full text
Abstract:
There is an arguable lack of activity and interest in the analysis of electroacoustic music when compared to its composition and performance. The absence of a strong and active analytical community is very concerning, as it should be a fundamental part of any larger musical community that wishes for works to be performed and discussed in later years. The problems that face electroacoustic music analysis are that there is no consensus or single analytical tool/methodology that dictates how such an activity should be undertaken. Rather than attempting to appropriate existing tools meant for traditional musics or create a new universal one this thesis will argue that a new culture should be adopted that promotes different opinions on the subject of electroacoustic music analysis, as opposed to defining a consensus as to how it should be conducted. To achieve this the thesis will: evaluate and critique what constitutes and defines electroacoustic music analysis; provide a general and flexible procedure to conduct an analysis of an electroacoustic work; develop a set of criteria and terms to cross-examine the current analytical tools for electroacoustic music in order to define the gaps in the field and to identify pertinent elements within electroacoustic works; analyse a number of electroacoustic works to test and implement the ideas raised within this thesis; and finally the concept of an analytical community (in which such a culture could exist) is outlined and implemented with the creation of the OREMA (Online Repository for Electroacoustic Music Analysis) project. This universal approach will cover both epistemological and ontological levels of electroacoustic music analysis. All of the concepts raised above are interlinked and follow the main hypothesis of this thesis: • There is no one single analysis that can fully investigate a work; • Analyses are a perspective on a work, ultimately formed through the subjective perception of the analyst; • These perspectives should be shared with other practitioners to help develop a better understanding of the art form. This PhD study was part of the New Multimedia Tools for Electroacoustic Music Analysis project (2010-2013) funded by the Arts and Humanities Research Council (UK). Other outcomes of that project included the various analysis symposiums held at De Montfort University in Leicester and the electroacoustic analysis software EAnalysis created by Pierre Couprie.
APA, Harvard, Vancouver, ISO, and other styles
21

Vitek, Francis. "Monetary policy analysis in a small open economy : development and evaluation of quantitative tools." Thesis, University of British Columbia, 2007. http://hdl.handle.net/2429/31703.

Full text
Abstract:
This doctoral thesis consists of four papers, the unifying theme of which is the development and evaluation of quantitative tools for purposes of monetary policy analysis and inflation targeting in a small open economy. These tools consist of alternative macroeconometric models of small open economies which either provide a quantitative description of the monetary transmission mechanism, or yield a mutually consistent set of indicators of inflationary pressure together with confidence intervals, or both. The models vary considerably with regards to theoretical structure, and are estimated with novel Bayesian procedures. In all cases, parameters and trend components are jointly estimated, conditional on prior information concerning the values of parameters or trend components. The first paper develops and estimates a dynamic stochastic general equilibrium or DSGE model of a small open economy which approximately accounts for the empirical evidence concerning the monetary transmission mechanism, as summarized by impulse response functions derived from an estimated structural vector autoregressive or SVAR model, while dominating that SVAR model in terms of predictive accuracy. The primary contribution of this first paper is the joint modeling of cyclical and trend components as unobserved components while imposing theoretical restrictions derived from the approximate multivariate linear rational expectations representation of a DSGE model. The second paper develops and estimates an unobserved components model for purposes of monetary policy analysis and inflation targeting in a small open economy. The primary contribution of this second paper is the development of a procedure to estimate a linear state space model conditional on prior information concerning the values of unobserved state variables. The third paper develops and estimates a DSGE model of a small open economy for purposes of monetary policy analysis and inflation targeting which provides a quantitative description of the monetary transmission mechanism, yields a mutually consistent set of indicators of inflationary pressure together with confidence intervals, and facilitates the generation of relatively accurate forecasts. The primary contribution of this third paper is the development of a Bayesian procedure to estimate the levels of the flexible price and wage equilibrium components of endogenous variables while imposing relatively weak identifying restrictions on their trend components. The fourth paper evaluates the finite sample properties of the procedure proposed in the third paper for the measurement of the stance of monetary policy in a small open economy with a Monte Carlo experiment. This Bayesian estimation procedure is found to yield reasonably accurate and precise results in samples of currently available size.
APA, Harvard, Vancouver, ISO, and other styles
22

Johansson, Jonathan. "A comparison of resource utilization and deployment time for open-source software deployment tools." Thesis, Högskolan i Skövde, Institutionen för informationsteknologi, 2017. http://urn.kb.se/resolve?urn=urn:nbn:se:his:diva-13874.

Full text
Abstract:
The purpose of this study is to compare the software deployment tools Ansible, Chef and SaltStack regarding deployment time and their respective resource utilization, and the findings of this study are also compared to the previous work of Benson et al. (2016), which also studied deployment time. However, there is no previous research which mentions resource utilization, which is equally important. The study consists of an experiment performed in one of the laboratory rooms at the University of Skövde where all three software deployment tools are configured to deploy a set amount of packages to three hosts each. By measuring deployment time with the most stable releases (as of 2017-04-22) for each software deployment tool, as well as resource utilization for each host and server, this study may assist system administrators in making more informed decisions when deciding which application to use to manage their computers and infrastructure. The results of the study show that Chef is the fastest software deployment tool in terms of deployment time. Chef is also shown to be the most optimized application, as its usage of resources is better than both Ansible and SaltStack.
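As a rough illustration of how deployment time can be measured from the controller side, the sketch below wraps a single run of a deployment tool's command line in a wall-clock timer. The inventory and playbook names are hypothetical, and this is not the measurement harness used in the thesis.

```python
# Minimal sketch: time one deployment run from the controller by wrapping the
# tool's CLI in a wall-clock timer. The playbook and inventory are placeholders.
import subprocess
import time

def timed_run(command):
    """Run a deployment command and return its wall-clock duration in seconds."""
    start = time.monotonic()
    subprocess.run(command, check=True)  # raises if the deployment fails
    return time.monotonic() - start

if __name__ == "__main__":
    # Example: one Ansible run against a hypothetical inventory and playbook.
    duration = timed_run(["ansible-playbook", "-i", "hosts.ini", "deploy_packages.yml"])
    print(f"Deployment finished in {duration:.1f} s")
```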
APA, Harvard, Vancouver, ISO, and other styles
23

William, Jeffry Leonardo, and Mochamad Rifky Wijaya. "Open Innovation Strategy: Open platform-based digital mapping; as tools for value creation and value capture : Case study of OpenStreetMap and Google Maps." Thesis, KTH, Industriell Marknadsföring och Entreprenörskap, 2017. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-216391.

Full text
Abstract:
Open innovation has been rising in popularity as an alternative to the traditional model for organizations to enhance innovation in their products or services. In the past, the innovation process was time-consuming and costly. It has now become significantly more efficient and effective, supported by the advancement of today's IT such as the Internet, cloud computing and big data. Open innovation has changed the source of innovation: from closed internal R&D to full utilization of consumers' collaboration. The decision to shift towards an open innovation strategy rests on several areas, including motivation, financial direction, and the preference for innovation strategies and business models fitting the organizational core strategy. This research studied the relation of these areas and its effect; it determined the way an IT organization creates and captures value by opening its product platform. This thesis was conducted to analyze the open innovation approach in an open digital navigation platform, featuring two platforms as case studies: Google Maps and OpenStreetMap. The investigation emphasized the use of open innovation strategy to build a platform, with crowdsourcing and open source software as the objects highlighted in the research. The data was collected from secondary sources. The research findings suggest that crowdsourcing and open source software are the main open innovation strategies implemented in IT digital mapping platforms to create and capture value. While these strategies have been practiced in both platforms, circumstances surrounding the internal aspects of the organizations (motivation, financial direction, and business strategy) affected the application of those strategies. The implementation results differ according to the preferred business model. The result of this research suggests that a non-profit organization tends to utilize open innovation to improve the value of its product through consumer collaboration, while a for-profit organization adopts open innovation to generate an additional pool of revenue through customers' feedback and input data. Open innovation leads to the creation of a new business model as the foundation of innovation.
APA, Harvard, Vancouver, ISO, and other styles
24

Krejčí, Jiří. "Open source nástroje na podporu byznys architektury." Master's thesis, Vysoká škola ekonomická v Praze, 2012. http://www.nusl.cz/ntk/nusl-124697.

Full text
Abstract:
Compared to the enterprise architecture discipline, business architecture is still relatively unexplored. While a large body of papers and literature on enterprise architecture has been written, business architecture remains behind in this respect. This thesis mainly deals with the creation of a conceptual model of business architecture artifacts and with comparing open source tools against it. The primary objective of this study is to compare open source tools in terms of their support for the business architecture artifacts. Fulfilling this primary objective depends on an important secondary objective: to analyze the most highly cited expert articles and scientific literature as well as the most common architectural frameworks used in organizations. Based on the analysis of these information sources, a conceptual model of the business architecture artifacts with the highest number of occurrences is created. These artifacts are then assigned to the specific models that fulfil them. Once the secondary objective is reached, the primary objective is addressed: the selected open source tools are compared against the created conceptual model and evaluated by the degree to which they fulfil it.
APA, Harvard, Vancouver, ISO, and other styles
25

Köksal, Sakip. "Face milling of nickel-based superalloys with coated and uncoated carbide tools." Thesis, Coventry University, 2000. http://curve.coventry.ac.uk/open/items/86a6b065-704a-475b-b805-9d3397487ddf/1.

Full text
Abstract:
A face milling machinability investigation of two difficult-to-machine nickel-based superalloys, namely Inconel 718 and Waspaloy, has been carried out with four different types of tungsten carbide tools under various cutting conditions. The tools comprised one double-layer CVD-TiCN+Al2O3 coated (KC994M), two PVD-TiN coated (KC720 and KC730) and one uncoated (KMF) tungsten carbide tool. The objectives of the study include investigation of tool performance, failure modes and wear mechanisms under the cutting conditions employed. In addition, the surface integrity of the machined surfaces, with regard to surface finish, subsurface microhardness and metallographic examination of the subsurface microstructure, was investigated. CVD-coated KC994M gave the best overall performance in terms of tool life at low and high cutting conditions on both workpieces. The second best-performing tool was the uncoated KMF grade, which gave tool lives as high as KC994M at lower cutting speeds. However, at higher cutting speeds, KMF was generally outperformed by the PVD-TiN coated tools. Short tool lives were obtained at the higher cutting speeds of 75 and 100 m/min due to premature failure by chipping. Tool wear in the low cutting speed range was due to a combination of progressive microchipping and plucking through a fracture/attrition related wear mechanism associated with cyclic workpiece adhesion and detachment, and abrasion/diffusion-related flank wear. Plucking and microchipping were the dominant wear mechanisms. Coating layers on the rake face of both CVD and PVD coated tools were almost completely removed within the first few seconds of cutting at all cutting speeds tested, thus becoming ineffective. On the flank face, however, they remained intact for a longer period, hence increasing tool performance in the medium cutting speed range. Analysis of the subsurface microstructures and microhardness measurements showed that plastic deformation was the predominant effect induced onto the machined surface, the degree of which was influenced by the cutting speed, tool wear and prolonged machining. In addition, surface irregularities in the form of tearing and embedded hard particles were found to occur, mainly associated with the chipping-dominated wear mode.
APA, Harvard, Vancouver, ISO, and other styles
26

Kučera, Jan. "Výběr a implementace open source nástroje pro řízení portfolia projektů." Master's thesis, Vysoká škola ekonomická v Praze, 2009. http://www.nusl.cz/ntk/nusl-19162.

Full text
Abstract:
Companies and other organizations have to change and adapt their strategies and redefine their goals constantly. Actions and steps that are needed in order to achieve the defined goals and to realize the change are often executed in the form of a project. There is no doubt that projects need to be managed in an appropriate way. In organizations where more than one project is executed in parallel and other projects are being planned or proposed at the same time, it is necessary to manage not only the individual projects but the entire portfolio of projects and programmes. Portfolio management is needed to ensure that the selected projects are best aligned with the defined goals and strategy, but this requires an analysis of a large amount of data about current and upcoming projects and about available resources and their utilization. It is not always easy to obtain all the required data and to perform the analysis. This is the reason why deployment of a project and portfolio management software tool should be considered. This thesis deals with the search for suitable open source project and portfolio management tools and with an assessment of the costs and benefits of deploying a selected open source tool in a hypothetical IT company. The goal of this thesis is to define the term project and portfolio management tool and to find at least five open source tools of this type that are suitable for deployment in an organization. At least one of these tools should be suitable for deployment in the IT company. Another goal is a comparison of the features and functionality provided by the open source tools with the functionality provided by a robust proprietary project and portfolio management solution deployed on premise and with one solution offered in the Software-as-a-Service model. Creation of a business case dealing with the implementation of the selected open source tool is the last goal of this thesis. Identification and description of available open source tools, and comparison of these tools with representatives of the robust project and portfolio management tools developed and distributed in models different from open source, are considered the main benefits of this thesis. The terms "open source software" and "project and portfolio management tool" are defined in the first part of this work. The definition of terms is followed by the definition of the overall approach to the assessment of the open source tools, which involves the definition of assessment criteria and obligatory requirements. The next part of the work is dedicated to the selection of the open source tools that are suitable for deployment in an organization and which represent candidate project and portfolio management tools. Assessment of these tools using the defined criteria is performed as the next step, followed by the comparison with representatives of proprietary project and portfolio management tools. The last part of the work is dedicated to the business case, which deals with the deployment of the selected open source tool. The work concludes with a discussion of whether the defined goals were met and with a summary of the results.
APA, Harvard, Vancouver, ISO, and other styles
27

Shikur, H. (Henok). "Assessing modeling and visualization capabilities of modeling tools: limitations and gaps of the open source modeling tools." Master's thesis, University of Oulu, 2015. http://urn.fi/URN:NBN:fi:oulu-201502111072.

Full text
Abstract:
Due to the increasing number of Information Communication Technology (ICT) environments, security is becoming a concern for many researchers and organisations. Organisations have implemented different security measures to protect their assets. Different industries, such as power plants and water, oil and gas utilities, are adopting different network modelling tools to guard their assets and to prepare for incidents that might occur in the future. Modelling tools are very important for the visualisation of computer networks, and there are currently many modelling tools with different modelling and visualisation capabilities. The aim of this research is to make a thorough assessment of the different modelling tools' capabilities for modelling computer networks and visualising computer network communication. Furthermore, it aims to show areas for improvement in order to increase the quality of modelling tools based on industry requirements. The research methodology takes the form of a case study. First, the study analyses previous research in order to illustrate gaps in the literature, as well as identifying the strengths and weaknesses of existing network modelling tools. The empirical part of the research includes, first, studying and evaluating seven open-source modelling tools based on different types of capabilities (which may limit the generalisability of the findings to some extent) and, second, selecting four modelling tools for further study. Once the four modelling tools had been evaluated based on literature reviews and the requirements set in this study, the top two open-source (OSS) modelling tool packages were selected, downloaded, installed, and evaluated further. The criteria used to evaluate the four modelling tools in this research are based on the requirements provided by the European company nSense, which provides vulnerability assessments, security consulting, and training, as well as on the existing literature. The evaluation of the tools resulted in screenshots that were captured and presented in this document for verification. Finally, the one tool which was the most suitable for further studies, and which fulfilled most of the requirements set in this research, was recommended for further research. In total, four modelling tools were chosen for the evaluation, using different literature reviews based on the requirements (see Appendix A) in this research. The results showed that the two top modelling tools were OMNeT++ and IMUNES. After practical analysis of these tools, OMNeT++ was found to be the best tool based on the aims and requirements of this research. Further, the study found that usability problems played a large part in evaluating the different modelling tools, which might have affected the outcome of the results. It can therefore be concluded that this type of evaluation is highly dependent on the evaluator's knowledge and skill, as well as on the usability of the tool.
APA, Harvard, Vancouver, ISO, and other styles
28

Hitchcock, Jonathan. "Decorating Asterisk : experiments in service creation for a multi-protocol telephony environment using open source tools." Thesis, Rhodes University, 2006. http://hdl.handle.net/10962/d1006539.

Full text
Abstract:
As Voice over IP (VoIP) becomes more prevalent, value-adds to the service will become ubiquitous. VoIP is no longer a single service application, but an array of marketable services of increasing depth, which are moving into the non-desktop market. In addition, as the range of devices in general use increases, it will become necessary for all services, including VoIP services, to be accessible from multiple platforms and through varied interfaces. With the recent introduction and growth of the open source software PBX system named Asterisk, the possibility of achieving these goals has become more concrete. In addition to Asterisk, a number of open source systems are being developed which facilitate the development of systems that interoperate over a wide variety of platforms and through multiple interfaces. This thesis investigates Asterisk in terms of its viability to provide the depth of services that will be required in a VoIP environment, as well as a number of other open source systems in terms of what they can offer such a system. It also investigates whether these services can be made available on different devices. Using various systems built as a proof of concept, this thesis shows that Asterisk, in conjunction with various other open source projects such as the Twisted framework, provides a concrete tool which can be used to realise flexible and protocol-independent telephony solutions for a small to medium enterprise.
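The abstract names the Twisted framework as one of the open source building blocks behind the proof-of-concept services. As a flavour of the style of glue service Twisted supports, the sketch below runs a tiny line-oriented TCP responder that maps an incoming caller ID to a routing hint; the routing rule, port number and wire format are invented for illustration, and the sketch does not touch any Asterisk interface described in the thesis.

```python
from twisted.internet import protocol, reactor
from twisted.protocols import basic

class CallInfoProtocol(basic.LineReceiver):
    """Reply to each caller-ID line with a hypothetical routing hint."""
    def lineReceived(self, line):
        caller = line.decode(errors="replace").strip()
        # Placeholder routing rule standing in for a real dialplan or database lookup.
        target = "support-queue" if caller.startswith("+46") else "reception"
        self.sendLine(f"{caller} -> {target}".encode())

factory = protocol.ServerFactory()
factory.protocol = CallInfoProtocol

if __name__ == "__main__":
    reactor.listenTCP(9000, factory)  # arbitrary port chosen for the example
    reactor.run()
```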
APA, Harvard, Vancouver, ISO, and other styles
29

López, Massaguer Oriol 1972. "Development of informatic tools for extracting biomedical data from open and propietary data sources with predictive purposes." Doctoral thesis, Universitat Pompeu Fabra, 2017. http://hdl.handle.net/10803/471540.

Full text
Abstract:
We developed new software tools to obtain information from public and private data sources in order to develop in silico toxicity models. The first of these tools is Collector, an Open Source application that generates "QSAR-ready" series of compounds annotated with bioactivities, extracting the data from the Open PHACTS platform using semantic web technologies. Collector was applied in the framework of the eTOX project to develop predictive models for toxicity endpoints. Additionally, we conceived, designed, implemented and tested a method to derive toxicity scorings suitable for predictive modelling starting from in vivo preclinical repeated-dose studies generated by the pharmaceutical industry. This approach was tested by generating scorings for three hepatotoxicity endpoints: 'degenerative lesions', 'inflammatory liver changes' and 'non-neoplasic proliferative lesions'. The suitability of these scores was tested by comparing them with experimentally obtained point of departure doses, as well as by developing tentative QSAR models, obtaining acceptable results. Our method relies on ontology-based inference to extract information from ontology-annotated data stored in a relational database. The method as a whole can be applied to other preclinical toxicity databases to generate toxicity scorings. Moreover, the ontology-based inference method on its own is applicable to any relational database annotated with ontologies.
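To give a feel for what ontology-based inference over annotated findings can look like, here is a toy sketch that walks subclass links to decide which recorded findings fall under a given endpoint and aggregates them into a score. The ontology fragment, the severity grades and the max-severity rule are illustrative assumptions only, not the scoring procedure defined in the thesis or the eTOX data model.

```python
# Toy ontology fragment: each term points to its parent class.
PARENT = {
    "hepatocellular necrosis": "degenerative lesions",
    "hepatocellular vacuolation": "degenerative lesions",
    "kupffer cell hyperplasia": "inflammatory liver changes",
}

def is_a(term, ancestor):
    """Follow parent links upwards to decide whether term falls under ancestor."""
    while term is not None:
        if term == ancestor:
            return True
        term = PARENT.get(term)
    return False

def endpoint_score(findings, endpoint):
    """Score an endpoint as the maximum severity grade of the findings mapped to it."""
    grades = [grade for term, grade in findings if is_a(term, endpoint)]
    return max(grades, default=0)

# Invented study findings as (ontology term, severity grade) pairs.
study = [("hepatocellular necrosis", 3), ("kupffer cell hyperplasia", 1)]
print(endpoint_score(study, "degenerative lesions"))        # -> 3
print(endpoint_score(study, "inflammatory liver changes"))  # -> 1
```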
APA, Harvard, Vancouver, ISO, and other styles
30

Bednář, Jan. "Srovnání komerčních BI reportovacích nástrojů s nástroji Open Source." Master's thesis, Vysoká škola ekonomická v Praze, 2011. http://www.nusl.cz/ntk/nusl-165134.

Full text
Abstract:
This diploma thesis deals with the comparison of commercial and Open Source Business Intelligence (BI) reporting tools. The main aim of the thesis is to provide a list of BI reporting tools and a further comparison of them. The evaluation is based on a set of criteria. Every criterion has an assigned value that represents its importance within a given group, and the same procedure is applied to the groups. The final evaluation is based on the defined values of these groups. The output of the thesis is a table structured into five parts according to the defined groups of criteria. The second part of the thesis walks the reader through a practical showcase of implementing one of the selected tools, SAP Business Objects Enterprise 4.0. At the beginning, there is a description of the report proposal that contains the graphical design and functionality requirements. The next part shows the whole process in detail.
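To illustrate the weighted, group-based evaluation scheme the abstract describes, the sketch below combines criterion scores into group scores and then into one overall value. The group names, weights and scores are made-up placeholders rather than the actual criteria or results from the thesis.

```python
# Minimal sketch of a weighted-criteria evaluation; all numbers are illustrative.
CRITERIA = {
    # group name: (group weight, {criterion: (criterion weight, score 0-10)})
    "Functionality": (0.4, {"ad-hoc reporting": (0.6, 8), "export formats": (0.4, 6)}),
    "Usability":     (0.3, {"report designer": (1.0, 7)}),
    "Cost":          (0.3, {"licensing": (0.5, 9), "support": (0.5, 5)}),
}

def weighted_score(criteria):
    """Combine criterion scores into group scores, then into one overall score."""
    total = 0.0
    for group_weight, members in criteria.values():
        group_score = sum(w * s for w, s in members.values())
        total += group_weight * group_score
    return total

print(f"overall score: {weighted_score(CRITERIA):.2f}")
```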
APA, Harvard, Vancouver, ISO, and other styles
31

Márki, András. "Evaluation on how suitable open source tools are for model to model transformation : An industrial point of view." Thesis, Högskolan i Skövde, Institutionen för kommunikation och information, 2013. http://urn.kb.se/resolve?urn=urn:nbn:se:his:diva-8265.

Full text
Abstract:
Model-Driven Development can improve the development process, but it needs tools for model transformation. For industrial companies, the most important aspect is that the transformation tools scale well, so that they can be used with huge models. There are several open-source model transformation tools on the market, and this report aims to investigate their scalability. For the investigation, the Eclipse Modeling Framework is used. This report identifies four open-source model transformation tools (ATL, QVT Operational, QVT Declarative, SmartQVT) and the variables needed for a tool to be evaluated within the bounds of an experiment. The test environment shows how each tool behaves as model size grows. The only tool which could be benchmarked was ATL, which scaled linearly in terms of both transformation time and memory consumption.
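As a rough illustration of how such a scalability benchmark can be driven, the sketch below times an external transformation run for increasing model sizes and reads the peak memory used by the child process. The `java -jar atl-runner.jar` command line and the model file names are assumptions made for the example; how ATL is actually launched headlessly, and how the input models are generated, is specific to the thesis's setup.

```python
import subprocess
import time
import resource  # Unix-only

def benchmark(cmd):
    """Time an external transformation run and report peak memory of child processes."""
    start = time.monotonic()
    subprocess.run(cmd, check=True)
    elapsed = time.monotonic() - start
    peak_kb = resource.getrusage(resource.RUSAGE_CHILDREN).ru_maxrss  # KiB on Linux
    return elapsed, peak_kb

# Hypothetical launcher and model files; replace with the real transformation command.
for size in (1_000, 10_000, 100_000):
    t, mem = benchmark(["java", "-jar", "atl-runner.jar", f"model_{size}.xmi"])
    print(f"{size} elements: {t:.1f} s, peak {mem / 1024:.0f} MiB")
```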
APA, Harvard, Vancouver, ISO, and other styles
32

Diederich, Benedict [Verfasser], Rainer [Gutachter] Heintzmann, and Christian [Gutachter] Eggeling. "Democratizing microscopy by introducing innovative Open-Source hard and software tools / Benedict Diederich ; Gutachter: Rainer Heintzmann, Christian Eggeling." Jena : Friedrich-Schiller-Universität Jena, 2021. http://d-nb.info/1239177542/34.

Full text
APA, Harvard, Vancouver, ISO, and other styles
33

Tlale, Moretlo Celia. "Real-time, open controller for reconfigurable manufacturing systems." Thesis, [Bloemfontein?] : Central University of Technology, Free State, 2013. http://hdl.handle.net/11462/196.

Full text
Abstract:
Thesis (M. Tech. (Information Technology)) -- Central University of Technology, Free State, 2013. Markets for manufactured products are characterized by a fragmentation of the market (with regard to size and time) and by shorter product cycles. This is due to the occurrence of mass customization and globalization. In mass customization, the same basic products are manufactured for a broad market, but consumers are then given the liberty to choose the "finishing touches" that go with the product. The areas that manufacturers now compete on are higher quality products, low cost and timely response to market changes. Appropriate business strategies and manufacturing technologies must thus be used to implement these strategic dimensions. The paradigm of the Reconfigurable Manufacturing System (RMS) has been introduced to respond to this new market-oriented manufacturing environment. The design of an RMS allows ease of reconfiguration, as it has a modular structure in terms of software and hardware, which serves as a strategy to adapt to changing market demands. Modularity allows software and hardware modules to be integrated or removed without affecting the rest of the system. An RMS can therefore be quickly reconfigured according to the production requirements of new models; it can be quickly adjusted to exact capacity requirements as the market grows and products change; and it is able to integrate new technology. In this research project, a real-time, open controller is designed and developed for Reconfigurable Manufacturing Tools (RMTs), the basic building blocks of an RMS. The real-time behaviour and openness of the RMT controllers allow, firstly, for the modular design of RMTs (so that they can be adapted easily to changing product demands) and, secondly, for prompt control of RMTs for diagnosability.
APA, Harvard, Vancouver, ISO, and other styles
34

Pillac, Victor. "Dynamic vehicle routing : solution methods and computational tools." Phd thesis, Ecole des Mines de Nantes, 2012. http://tel.archives-ouvertes.fr/tel-00742706.

Full text
Abstract:
Within the wide scope of logistics management, transportation plays a central role and is a crucial activity in both production and service industry. Among others, it allows for the timely distribution of goods and services between suppliers, production units, warehouses, retailers, and final customers. More specifically, Vehicle Routing Problems (VRPs) deal with the design of a set of minimal cost routes that serve the demand for goods or services of a set of geographically spread customers, satisfying a group of operational constraints. While it was traditionally a static problem, recent technological advances provide organizations with the right tools to manage their vehicle fleet in real time. Nonetheless, these new technologies also introduce more complexity in fleet management tasks, unveiling the need for decision support systems dedicated to dynamic vehicle routing. In this context, the contributions of this Ph.D. thesis are threefold: (i) it presents a comprehensive review of the literature on dynamic vehicle routing; (ii) it introduces flexible optimization frameworks that can cope with a wide variety of dynamic vehicle routing problems; (iii) it defines a new vehicle routing problem with numerous applications.
APA, Harvard, Vancouver, ISO, and other styles
35

Paraskevopoulou, Maria. "Towards a polyocular model for the analysis of music for moving image : an open list of tools and strategies." Thesis, University of Bristol, 2006. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.432328.

Full text
APA, Harvard, Vancouver, ISO, and other styles
36

Dlodlo, Joseph Bhekizwe. "Enterprise resource planning in manufacturing SMEs in the Vaal Triangle / Dlodlo J.B." Thesis, North-West University, 2011. http://hdl.handle.net/10394/7355.

Full text
Abstract:
The adoption of Enterprise Resource Planning (ERP) tools has improved business processes in organisations. This increase has, however, come with challenges for the small and medium business sector. First, the adoption and deployment of proprietary ERP comes at great cost for organisations, while it is also difficult for them to ensure scalability given the dynamic change in the SME sector. The adoption and use of open source ERP tools therefore presents an opportunity for the SME sector. The usage of open source software has increased over the years, and this increase has also extended to open source ERP tools. These tools offer the same functions as the proprietary ones at a fraction of the cost. Despite the benefits that open source ERP offers, diffusion of this technology into the SME sector in South Africa has been minimal. This means the SME sector in South Africa is not benefiting from the widely available, cost-effective open source ERP on the market. An opportunity therefore exists for SMEs to utilise the technology to gain competitive advantage. The research was done primarily to determine the open source ERP adoption patterns of SMEs in the Vaal region. It sought to determine the drivers for ERP adoption and the barriers to adoption, with a specific focus on investigating knowledge of open source alternatives. The results indicate that the SMEs understand the benefits of adopting ERP for their businesses. The research further revealed that the adoption factors identified in the literature are still relevant in South Africa; these include costs, lack of training, lack of support, lack of knowledge and the lack of open source vendors. The empirical research also identified that SMEs still harbour fears about open source ERP tools due to a lack of training, support and knowledge. From the study it is clear that a new approach needs to be taken to encourage adoption of open source ERP, including giving incentives to SMEs to adopt it. These incentives may be in the form of training packages and skills workshops to help overcome the barriers and improve implementation of open source ERP. The research identified the critical need for increased formal education and training in open source software development, with emphasis on ERP tools, and government research bodies need to play a role in this area. For example, there is a need for SME and university collaborations in open source ERP deployment, whereby the latter include software development tools in their curricula so as to increase awareness among SMEs. Clear-cut comparisons of existing proprietary systems against open source systems, focusing on functional and technological requirements, need to be undertaken. This will help to reflect a cost-benefit analysis as well as interoperability between the open source ERP and existing systems. During the adoption process, SMEs are encouraged to go through the full life cycle of open source ERP adoption, in which they perform a thorough analysis regarding selection, implementation, integration, migration, training and evaluation of the installed tools. That way they will gain the competitive advantage afforded by the tools. Thesis (M.B.A.)--North-West University, Potchefstroom Campus, 2012.
APA, Harvard, Vancouver, ISO, and other styles
37

KHROMOVA, ANNA. "Are the Urban Parks becoming Cyberparks? The Developing of Public Open Spaces: ICT tools to support the landscape planning process." Doctoral thesis, Università Politecnica delle Marche, 2018. http://hdl.handle.net/11566/253133.

Full text
Abstract:
Nowadays, given the growing complexity of city systems, the quality of human life depends considerably on the quality of the urban environment. One of the core components of cities are public open spaces (POS), which should be more accessible to all on equal terms, creating a synergistic balance between grey, green and social components. Technology can play a pivotal role in this ambitious task, since it already generates interest and attraction for many people in their everyday life. However, even if there are relevant cases where the adoption of ICT has made the use of POS smarter, the recognition of possible interrelations and added values is still not very well elaborated so far. There is thus the need to upgrade POSs to the quality of Cyberspaces, defining them as particular outdoor places where the use of ICT creates a synergy between humans and the environment. This is the logical step forward of the progress that we can see in our everyday life, since these new technologies have a twofold contribution: on one hand, they provide users with new instruments to enhance their experience of visiting and discovering a place; on the other, they represent a valuable source of information that should be used for the planning process. Hence, this innovation can bring landscape planners to take into account new forms of interaction, generating an endless process where the user himself becomes the planner.
The reason for this relies on one fundamental point that must be underlined: the modern technology used in the method we propose creates synergy between the two categories of users and planners, joining them in a unique entity and so allowing the users to be involved in the planning process (which is one of the Cyberpark concepts). In this light, there is the need to understand what the real contribution of ICTs in the planning process can be, and this is achieved by performing a comparison between an approach that includes them and an approach, which we can define as a classic one, that does not. In addition, the way users experience the space will also be taken into account. The innovation is offered by the possibility to collect a great amount of data in the same period of time and in a smarter way: the citizens are involved in the urban planning project by means of a mobile application, through which they can give real-time feedback to the planners. What is relevant for the planners is to analyse human spatial behaviour, which is essential for a good understanding of multiple aspects (political, economic, tourist, social and cultural) of a modern city. Given the above, in line with recent research trends, the work presented in this thesis deals with the study, analysis and development of Cyberspaces, starting from a theoretical point of view and then passing to its direct implementation, also highlighting the enhancement that the Cyberpark concept can bring to mankind. From a theoretical perspective, the work performed was to study the historical evolution from smart cities to senseable ones, with the aim of understanding the dynamics that made possible the upgrade from a classical to a Cyber model of the city planning process. Such effort was necessary in order to understand how the given models can be applied to the real urban environment we are called to deal with, i.e. which (theoretical) tools we can use to correctly interpret it. Afterwards, different case studies have been implemented in real urban scenarios by adopting ICT solutions able to provide information to the users and, at the same time, to collect users' information to be analysed by the planners. In particular, three main cases were of particular importance in our research, each one having a specific purpose but all aimed at a common one: the understanding of how ICT tools can upgrade a POS to the dimension of a Cyberspace. Going into detail, the first case study was the implementation of an AR experience to make users more aware of their surroundings. The second one was aimed at comparing classical approaches to analysing human spatial behaviour with an innovative data collection platform aimed at the same task. The third and last dealt with the implementation of a complete mobile-based architecture in the context of an urban park, able to provide contextual information to the users and, at the same time, to collect their data. The outcome of this last case study was then analysed and visualized by means of a dedicated GIS, which proved to be the most valuable solution at the disposal of the landscape architect to enhance the design process of a POS.
The research is backed by the framework of the CyberParks project - Fostering knowledge about the relationship between Information and Communication Technologies and Public Spaces, a multidisciplinary European COST action that, among others, discusses various ICT tools and presents a methodology that can be adopted for the future development of cyberspaces, useful for both planners and users. The main contributions and novelties of this research thesis can, in conclusion, be summarised in the following items: i) outlining a step forward in the research domain of smart cities; ii) renovating the landscape design process by adopting the theoretical approach called the scientific rationalities; iii) the definition of a well-established pipeline of work, applied in real case studies; and iv) the analysis, using innovative tools, of human spatial behaviour from the landscape perspective, based on real data. With the proposed approach, we make a step forward, basing the planning process on users' needs: "Through smartphones the cities are now burgeoning and unfolding inside every pocket. Every citizen has a tool in which to perceive and process the city; peering through his digital lens is an intensely personal experience."
APA, Harvard, Vancouver, ISO, and other styles
38

Baldini, Jacopo. "New visualization tools for sciences and humanities: databases and virtual reality." Doctoral thesis, Scuola Normale Superiore, 2018. http://hdl.handle.net/11384/85816.

Full text
Abstract:
From a scientific point of view, one of the major problems related to simulations developed for Virtual Reality, more specifically in research, is the poor development of the interaction between the data being displayed and the database in which they are contained, combined with the difficulty of making everything directly accessible in the virtual environment. In the Digital Humanities, it becomes increasingly necessary to have an instrument capable of combining scientific visualization in Virtual Reality with the access and consultation of an open repository. The work done for this thesis is based on the creation of a heterogeneous relational database called ArcheoDB, run by an open web platform, on which researchers can upload 3D content and share it with their own collaborators. The web platform and the database are accompanied by an application called "ArcheoDB VR Toolkit", developed with the Unity Engine, which dynamically loads content from the ArcheoDB database and visualizes it in a complex scene. This application also provides a detailed reconstruction of the various objects, the background of the discovery and the chemical analyses performed, in a highly immersive way thanks to the use of immersive visualization. Likewise, in order for the application to be truly innovative, it must allow users not only to display 3D data, but also to interact in real time with the displayed models, enriching and modifying them based on data consultation and metadata drawn from the database.
APA, Harvard, Vancouver, ISO, and other styles
39

Abd, Rahman M. N. "Modelling of physical vapour deposition (PVD) process on cutting tool using response surface methodology (RSM)." Thesis, Coventry University, 2009. http://curve.coventry.ac.uk/open/items/cca436cf-b72b-c899-ef02-bd522b0d7ec5/1.

Full text
Abstract:
The Physical Vapour Deposition (PVD) magnetron sputtering process is one of the widely used techniques for depositing thin film coatings on substrates for various applications such as integrated circuit fabrication, decorative coatings, and hard coatings for tooling. In the area of coatings on cutting tools, tool life can be improved drastically with the application of hard coatings. Application of coatings on cutting tools for various machining techniques, such as continuous and interrupted cutting, requires different coating characteristics, these being highly dependent on the process parameters under which they were formed. To efficiently optimise and customise the deposited coating characteristics, PVD process modelling using the RSM methodology was proposed. The aim of this research is to develop a PVD magnetron sputtering process model which can predict the relationship between the process input parameters and the resultant coating characteristics and performance. Response Surface Methodology (RSM) was used, this being one of the most practical and cost-effective techniques to develop a process model. Even though RSM has been used for the optimisation of the sputtering process, published RSM modelling work on the application of the hard coating process on cutting tools is lacking. This research investigated the deposition of TiAlN coatings onto tungsten carbide cutting tool inserts using the PVD magnetron sputtering process. The input parameters evaluated were substrate temperature, substrate bias voltage, and sputtering power; the output responses were coating hardness, coating roughness, and flank wear (coating performance). In addition, coating microstructures were investigated to explain the behaviour of the developed model. The coating microstructural phenomena assessed were: crystallite grain size, XRD peak intensity ratio I111/I200 and atomic number percentage ratio of Al/Ti. Design Expert 7.0.3 software was used for the RSM analysis. Three process models (hardness, roughness, performance) were successfully developed and validated. The modelling validation runs were within the 90% prediction interval of the developed models and their residual errors compared to the predicted values were less than 10%. The models were also qualitatively validated by justifying the behaviour of the output responses (hardness, roughness, and flank wear) and microstructures (Al/Ti ratio, crystallographic peak ratio I111/I200, and grain size) with respect to the variation of the input variables, based on the work published by researchers and practitioners in this field. The significant parameters that influenced the coating hardness, roughness, and performance (flank wear) were also identified. Coating hardness was influenced by the substrate bias voltage, sputtering power, and substrate temperature; coating roughness was influenced by sputtering power and substrate bias; and coating performance was influenced by substrate bias. The analysis also discovered a significant interaction between the substrate temperature and the sputtering power which significantly influenced coating hardness, roughness, and performance; this interaction phenomenon has not been reported in previously published literature. The correlation study between coating characteristics, microstructures and coating performance (flank wear) suggested that coating performance correlated most significantly with coating hardness, with a Pearson coefficient of determination (R2) of 0.7311. The study also suggested some correlation between coating performance and the atomic percentage ratio of Al/Ti and grain size, with R2 values of 0.4762 and 0.4109 respectively.
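As a worked illustration of the response-surface idea (not the thesis's actual data or Design Expert workflow), the sketch below fits a second-order polynomial surface to placeholder measurements of three coded factors and predicts the response at a new setting.

```python
import numpy as np

# Minimal sketch of fitting a second-order response-surface model
# y = b0 + sum(bi*xi) + sum(bii*xi^2) + sum(bij*xi*xj).
# The data below are random placeholders, not measurements from the thesis.
rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(20, 3))   # coded temperature, bias voltage, power
y = rng.uniform(25, 35, size=20)       # e.g. coating hardness (arbitrary units)

def design_matrix(X):
    """Build the quadratic design matrix with intercept, linear, square and interaction terms."""
    x1, x2, x3 = X.T
    return np.column_stack([np.ones(len(X)), x1, x2, x3,
                            x1**2, x2**2, x3**2,
                            x1*x2, x1*x3, x2*x3])

coeffs, *_ = np.linalg.lstsq(design_matrix(X), y, rcond=None)
prediction = design_matrix(np.array([[0.2, -0.5, 0.1]])) @ coeffs
print("fitted coefficients:", np.round(coeffs, 3))
print("predicted response at (0.2, -0.5, 0.1):", round(float(prediction[0]), 2))
```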
APA, Harvard, Vancouver, ISO, and other styles
40

Koenitz, Hartmut. "Reframing interactive digital narrative: toward an inclusive open-ended iterative process for research and practice." Diss., Georgia Institute of Technology, 2010. http://hdl.handle.net/1853/34791.

Full text
Abstract:
In more than two decades of research and practical experiments in interactive digital narrative (IDN), much insight about the relationship of narrative and digital media has been gained and many successful experiments have been undertaken, as a survey of the field illustrates. However, current approaches also limit the scope of experimentation and constrain theory in interactive narrative forms original to digital media. After reviewing the "interactivisation" of legacy theory (neo-Aristotelian poetics for interactive drama, poststructuralism for hyperfiction, 20th century narratology for interactive fiction and as a general theory for IDN), the thesis introduces a theoretical framework that changes the focus from the product-centered view of legacy media towards the system and the process of instantiation. The terms protostory, describing the overall space of potential narratives in an IDN system, narrative design, for the concrete assemblage of elements, and narrative vectors, as substructures that enable authorial control, are introduced to supersede legacy terms like story and plot. On the practical side, the thesis identifies limitations of existing approaches (e.g. legacy metaphors like the timeline, and authoring tools that support only particular traditions). To overcome these limitations, a software toolset built on the principles of robustness, modularity, and extensibility is introduced and some early results are evaluated. Finally, the thesis proposes an inclusive, open-ended iterative process as a structure for future IDN research, in which practical implementations and research co-exist in a tightly coupled mutual relationship that allows changes on one side to be integrated on the other.
APA, Harvard, Vancouver, ISO, and other styles
41

WILLIAM, JEFFRY LEONARDO, and MOCHAMAD RIFKY WIJAYA. "Open Innovation Strategy: Open platform-based digital mapping; as tools for value creation and value capture - case study of OpenStreetMap and Google maps." Thesis, KTH, Skolan för industriell teknik och management (ITM), 2017. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-224843.

Full text
APA, Harvard, Vancouver, ISO, and other styles
42

Khan, Asif [Verfasser], Christian [Akademischer Betreuer] Niemann-Delius, and Marco [Akademischer Betreuer] Lübbecke. "Development of new metaheuristic tools for long term production scheduling of open pit mines / Asif Khan ; Christian Niemann-Delius, Marco Lübbecke." Aachen : Universitätsbibliothek der RWTH Aachen, 2016. http://d-nb.info/1129180808/34.

Full text
APA, Harvard, Vancouver, ISO, and other styles
43

Baiao, Manuel Mazanga. "Open source verktygs stöd för centrala egenskaper hos Business Process Management (BPM) system." Thesis, Högskolan i Skövde, Institutionen för informationsteknologi, 2018. http://urn.kb.se/resolve?urn=urn:nbn:se:his:diva-15732.

Full text
Abstract:
In a world of developing complex information technologies, companies and organizations need flexible and up-to-date business systems that match these technological developments. For this reason, commercial BPMS (Business Process Management System) tools have become popular because of their high ability to quickly integrate with business systems in new work environments. An alternative to commercial BPMS tools are the OSS-based (Open Source Software) BPMS tools, whose open source code is available for the public to develop further. Since the OSS-based BPMS tools are often new on the market and need further development, the purpose of this paper was to conduct an analysis of open source-based BPMS tools with respect to their various features. Such an analysis increases the understanding of the tools' functionality and therefore facilitates their further development. Based on an established analysis model by Delgado et al. (2015), a thematic content analysis was conducted to compare the characteristics of these tools. More specifically, collected documents describing the properties of the BPMS tools were analyzed. In this way, 6 different BPMS tools were compared based on 13 properties. The analysis showed that Bonitasoft, jBPM and jBPM5 were the most well-developed BPMS tools, supporting most of the features. In addition, the analysis demonstrated that tools with fewer features could still be a good option for some companies, depending on the characteristics and functionalities that are prioritized within the company.
APA, Harvard, Vancouver, ISO, and other styles
44

Bjuhr, Katarina, and Niklas Dahl. "Öppen Innovation : En kvalitativ studie om idétransformation inom företag." Thesis, Umeå universitet, Institutionen för informatik, 2016. http://urn.kb.se/resolve?urn=urn:nbn:se:umu:diva-121075.

Full text
Abstract:
The concept of open innovation is about outsourcing part of the innovation process to external actors and making use of resources and knowledge other than your own. This is one possible way to meet the challenges that today's competitive and fast-moving market places on companies. Open innovation has created the conditions for open events such as hackathons, and through these kinds of events ideas can be generated that companies can use in their business. Outside-in is a perspective of open innovation as a means to integrate external knowledge and ideas into one's own operations, and this study takes place in the context of open innovation with an outside-in perspective. The study aims to find out how methods, techniques and tools affect the transition from an idea to an internal project, which results in new products and services. The study is qualitative, and data has been collected through interviews with companies which have extensive knowledge of working with innovation. The results of the study show that there is a gap in the transformation of ideas into internal projects, and that there is no systematic approach to the methods, techniques and tools used to facilitate the transformation.
APA, Harvard, Vancouver, ISO, and other styles
45

QAZIZADA, RASHED. "Enabling Java Software Developers to use ATCG tools by demonstrating the tools that exist today, their usefulness, and effectiveness." Thesis, Linnéuniversitetet, Institutionen för datavetenskap och medieteknik (DM), 2021. http://urn.kb.se/resolve?urn=urn:nbn:se:lnu:diva-107316.

Full text
Abstract:
The software industry is expanding at a rapid rate. To keep up with the fast-growing and ever-changing technologies, it has become necessary to produce high-quality software in a short time and at an affordable cost. This research aims to demonstrate to Java developers the use of Automated Test Case Generation (ATCG) tools by presenting the tools that exist today, their usefulness, and their effectiveness. The main focus is on automated testing tools for the Java industry, which can help developers achieve their goals faster and make better software. Moreover, the discussion covers the availability, features, prerequisites, effectiveness, and limitations of the automated testing tools. Among these tools, the most widely used are Evosuite, JUnit, TestNG, and Selenium. Each tool has its own advantages and purpose. Furthermore, these ATCG tools were compared to provide a clear picture to Java developers, answer the research questions, and show the strengths and limitations of each selected tool. The results show that there is no single ultimate tool that can do all kinds of testing independently; it all depends on what the developer aims to achieve. Where one tool is good at generating unit test cases for Java classes, another is good at testing code security through penetration testing. Therefore, Java developers may choose one or more tools based on their requirements. This study has revealed captivating findings regarding the ATCG tools, which ought to be explored in the future.
APA, Harvard, Vancouver, ISO, and other styles
46

Zingoni, Jacopo. "Semantic Enrichment of Scientific Documents with Semantic Lenses – Developing methodologies, tools and prototypes for their concrete use." Master's thesis, Alma Mater Studiorum - Università di Bologna, 2012. http://amslaurea.unibo.it/4476/.

Full text
Abstract:
With this dissertation I aim to illustrate the results of my research in the field of Semantic Publishing, consisting of the development of a set of methodologies, tools and prototypes, combined with the study of a concrete use case, aimed at the application and focusing of Semantic Lenses.
APA, Harvard, Vancouver, ISO, and other styles
47

Bernabeu, Llinares Miguel Oscar. "An open source HPC-enabled model of cardiac defibrillation of the human heart." Thesis, University of Oxford, 2011. http://ora.ox.ac.uk/objects/uuid:9ca44896-8873-4c91-9358-96744e28d187.

Full text
Abstract:
Sudden cardiac death following cardiac arrest is a major killer in the industrialised world. The leading cause of sudden cardiac death is a disturbance in the normal electrical activation of cardiac tissue, known as cardiac arrhythmia, which severely compromises the ability of the heart to fulfil the body's demand for oxygen. Ventricular fibrillation (VF) is the most deadly form of cardiac arrhythmia. Furthermore, electrical defibrillation through the application of strong electric shocks to the heart is the only effective therapy against VF. Over the past decades, a large body of research has dealt with the study of the mechanisms underpinning the success or failure of defibrillation shocks. The main mechanism of shock failure involves shocks terminating VF but leaving the appropriate electrical substrate for new VF episodes to rapidly follow (i.e. shock-induced arrhythmogenesis). A large number of models have been developed for the in silico study of shock-induced arrhythmogenesis, ranging from single cell models to three-dimensional ventricular models of small mammalian species. However, no extrapolation of the results obtained in the aforementioned studies has been done in human models of ventricular electrophysiology. The main reason is the large computational requirements associated with the solution of the bidomain equations of cardiac electrophysiology over large anatomically-accurate geometrical models including representation of fibre orientation and transmembrane kinetics. In this Thesis we develop simulation technology for the study of cardiac defibrillation in the human heart in the framework of the open source simulation environment Chaste. The advances include the development of novel computational and numerical techniques for the solution of the bidomain equations in large-scale high performance computing resources. More specifically, we have considered the implementation of effective domain decomposition, the development of new numerical techniques for the reduction of communication in Chaste's finite element method (FEM) solver, and the development of mesh-independent preconditioners for the solution of the linear system arising from the FEM discretisation of the bidomain equations. The developments presented in this Thesis have brought Chaste to the level of performance and functionality required to perform bidomain simulations with large three-dimensional cardiac geometries made of tens of millions of nodes and including accurate representation of fibre orientation and membrane kinetics. These advances have enabled the in silico study of shock-induced arrhythmogenesis for the first time in the human heart, therefore bridging an important gap in the field of cardiac defibrillation research.
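For readers unfamiliar with the model the abstract refers to, a standard statement of the bidomain equations is sketched below in common notation (transmembrane potential V_m, extracellular potential phi_e, intra- and extracellular conductivity tensors sigma_i and sigma_e); the notation is a common textbook form assumed here, not quoted from the thesis.

```latex
% Standard bidomain formulation (common notation, assumed rather than quoted from the thesis)
\begin{align}
  \chi \left( C_m \frac{\partial V_m}{\partial t} + I_{\mathrm{ion}}(V_m, \mathbf{w}) \right)
    &= \nabla \cdot \big( \boldsymbol{\sigma}_i \nabla (V_m + \phi_e) \big) + I_{\mathrm{stim}}, \\
  0 &= \nabla \cdot \big( \boldsymbol{\sigma}_i \nabla V_m + (\boldsymbol{\sigma}_i + \boldsymbol{\sigma}_e) \nabla \phi_e \big),
\end{align}
```

where chi is the membrane surface-to-volume ratio and C_m the membrane capacitance per unit area; finite element discretisation of this coupled system produces the very large linear systems whose parallel solution and preconditioning the thesis addresses.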
APA, Harvard, Vancouver, ISO, and other styles
48

Pearce, Michael Bruce. "Development and evaluation of a secure web gateway with messaging functionality : utilizing existing ICAP and open-source tools to notify and protect end users from Internet security threats." Thesis, University of Canterbury. Computer Science and Software Engineering, 2010. http://hdl.handle.net/10092/5457.

Full text
Abstract:
Secure web gateways aim to protect end user systems against web-based threats. Many proprietary commercial systems exist; however, their mechanisms of operation are not generally publicly known. This project undertook the development and evaluation of an open source and standards-based secure web gateway. The proof-of-concept system developed uses a combination of open source software (including the Greasyspoon ICAP server, the Squid HTTP proxy, and Clam Antivirus) and Java modules installed on the ICAP server to perform various security tasks that range from simple (such as passive content insertion) to more advanced (such as active content alteration). The makeup of the proof-of-concept system and the evaluation methodology for both effectiveness and performance are discussed. Effectiveness was tested using comparative analysis of groups of self-browsing high-interaction client honeypots (employing a variety of security measures) and recording the different system alteration rates. Performance was tested across a wide range of variables to determine the failure conditions and the optimal setup for the components used. The system developed met the majority of the goals set, and results from testing indicate an improvement in infection rates over unprotected systems. The performance levels attained were suitable for small-scale deployments, but optimization is necessary for larger-scale deployments.
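The "passive content insertion" task mentioned above is easy to picture with a small sketch: take an HTML response body on its way back to the client and inject a notification banner. The proof of concept used Java modules on the Greasyspoon ICAP server; the Python sketch below only illustrates the transformation itself, with an invented banner, outside of any ICAP API.

```python
import re

# Invented notification text; a real gateway would make this configurable.
WARNING_BANNER = (
    '<div style="background:#fdd;padding:4px">'
    'Security gateway notice: this page was scanned before delivery.'
    '</div>'
)

def insert_banner(html: str) -> str:
    """Insert a notification banner right after the opening <body> tag, if present."""
    # Case-insensitive match of the opening body tag, including any attributes.
    return re.sub(r"(<body[^>]*>)", r"\1" + WARNING_BANNER, html,
                  count=1, flags=re.IGNORECASE)

if __name__ == "__main__":
    page = "<html><body><p>Hello</p></body></html>"
    print(insert_banner(page))
```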
APA, Harvard, Vancouver, ISO, and other styles
49

Musasa, Mutombo Mike. "Evaluation of embedded processors for next generation asic : Evaluation of open source Risc-V processors and tools ability to perform packet processing operations compared to Arm Cortex M7 processors." Thesis, KTH, Skolan för elektroteknik och datavetenskap (EECS), 2021. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-299656.

Full text
Abstract:
Nowadays, network processors are an integral part of information technology. With the deployment of 5G networks ramping up around the world, numerous new devices are going to take advantage of their processing power and programming flexibility. Contemporary information technology providers such as Ericsson spend a great amount of financial resources on licensing deals to use processors with proprietary instruction set architecture designs from companies like Arm Holdings. There is a new non-proprietary instruction set architecture technology being developed, known as Risc-V. There are many open source processors based on the Risc-V architecture, but it is still unclear how well an open-source Risc-V processor performs network packet processing tasks compared to an Arm-based processor. The main purpose of this thesis is to design a test model simulating and evaluating how well an open-source Risc-V processor performs packet processing compared to an Arm Cortex M7 processor. This was done by designing C code simulating key packet processing functions over 50 randomly generated 72-byte data packets. The following functions were tested: framing, parsing, pattern matching, and classification. The code was ported and executed on both an Arm Cortex M7 processor and emulated open-source Risc-V processors. A working packet processing test code was built and evaluated on an Arm Cortex M7 processor. Three different open-source Risc-V processors were tested: Arianne, SweRV core, and Rocket-chip. The execution times of the two cases were analyzed and compared; the execution time of the test code on Arm was 67.5 ns. Based on the results, it can be argued that open-source Risc-V processor tools are not yet fully reliable and ready to be used for packet processing applications. Further evaluation should be performed on this topic, with a more in-depth look at the SweRV core processor and at physical open-source Risc-V hardware instead of emulators.
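To make the four measured functions concrete, here is a toy sketch that frames 50 random 72-byte packets and runs parsing, pattern matching and classification over them while timing the loop. The header layout, signature bytes and traffic classes are invented for the example; the thesis's C implementation on the Cortex M7 and Risc-V targets is, of course, different.

```python
import random
import struct
import time

PACKET_LEN = 72          # packet size used in the thesis experiment
SIGNATURE = b"\xde\xad"  # made-up pattern for the matching step

def parse(packet: bytes):
    """Parse a toy 14-byte header (dst, src, ethertype) from the framed packet."""
    dst, src, ethertype = struct.unpack("!6s6sH", packet[:14])
    return dst, src, ethertype

def classify(ethertype: int) -> str:
    """Classify by ethertype; the two classes here are illustrative only."""
    return "ipv4" if ethertype == 0x0800 else "other"

packets = [bytes(random.getrandbits(8) for _ in range(PACKET_LEN)) for _ in range(50)]

counts = {"ipv4": 0, "other": 0}
matches = 0
start = time.perf_counter_ns()
for pkt in packets:
    _, _, ethertype = parse(pkt)      # framing + parsing of the toy header
    if SIGNATURE in pkt:              # pattern matching
        matches += 1
    counts[classify(ethertype)] += 1  # classification
elapsed = time.perf_counter_ns() - start
print(f"processed {len(packets)} packets in {elapsed} ns, {matches} matches, {counts}")
```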
APA, Harvard, Vancouver, ISO, and other styles
50

Panteleewa, Marina. "Schöne Neue Welt für Übersetzer Crowdsourcing / Kollektive Übersetzungsarbeit Und die Weisheit der Vielen." Bachelor's thesis, Alma Mater Studiorum - Università di Bologna, 2017. http://amslaurea.unibo.it/14171/.

Full text
Abstract:
What does collaborative translation work mean, and which technological and social developments and trends of recent years have made it possible? This thesis examines the motivations of amateur translators and, by clarifying "who does what on the net", tries to form a picture of what is referred to on the Internet as "the wisdom of the crowd", i.e. crowdsourcing. It further investigates how translators benefit from the technological wave of Web 2.0 and which generally positive trends in the translation landscape, thanks to the contribution of collaboration technologies and collaborative ways of working on the net, are already part of everyday practice. The focus of the discussion lies in the characterization of the web-based electronic resource for computer-assisted translation "Matecat".
APA, Harvard, Vancouver, ISO, and other styles
