To see the other types of publications on this topic, follow the link: Informational and analytical software.

Dissertations / Theses on the topic 'Informational and analytical software'

Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles


Consult the top 50 dissertations / theses for your research on the topic 'Informational and analytical software.'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online, whenever these are available in the metadata.

Browse dissertations / theses from a wide variety of disciplines and organise your bibliography correctly.

1

Fotrousi, Farnaz, and Katayoun Izadyan. "Analytics-based Software Product Planning." Thesis, Blekinge Tekniska Högskola, Sektionen för datavetenskap och kommunikation, 2013. http://urn.kb.se/resolve?urn=urn:nbn:se:bth-5053.

Full text
Abstract:
Context. Successful software product management is concerned with developing the right software products for the right markets at the right time. The product manager, who carries the responsibility for planning, requires, but does not always have access to, high-quality information for making the best possible planning decisions. This master's thesis concentrates on proposing a solution that supports the planning of a software product by means of analytics. Objectives. The aim of the thesis is to understand the potential of analytics for product planning decisions in a SaaS context. It focuses on SaaS-based analytics used for portfolio management, product roadmapping, and release planning, and specifies how the analytics can be utilized for planning a software product. The study then devises an analytics-based method to enable software product planning. Methods. The study was designed with a mixed-methodology approach, which includes literature review and survey research as well as a case study, under the framework of design science. The literature review was conducted to identify product planning decisions and the measurements that support them. A total of 17 interview-based surveys were conducted to investigate the impact of analytics on product planning decisions in a product roadmapping context. The interviews resulted in an analytics-based planning method, devised under the design-science framework. The designed analytics-based method was validated by a case study in order to measure the effectiveness of the solution. Results. The identified product planning decisions were summarized and categorized into a taxonomy of decisions divided into portfolio management, roadmapping, and release planning. The identified SaaS-based measurements were grouped into six categories, forming a taxonomy of measurements. The survey results illustrated that the importance ratings of the measurement categories do not differ much across planning decisions. In the interviews, 61.8% of interviewees selected "very important" for the "Product" category, 58.8% for "Feature", and 64.7% for "Product healthiness". For the "Referral sources" category, 61.8% of responses were rated "not important". The "Technologies and Channels" and "Usage Pattern" categories were mostly rated "important", by 47.1% and 32.4% of the corresponding responses. The results also showed that product use, feature use, users of feature use, response time, product errors, and downtime are the top measurement attributes that a product manager prefers to use for product planning. Qualitative results identified product specification, product maturity, and goals as factors affecting the importance of analytics for product planning, and in parallel specified strengths and weaknesses of analytical planning from product managers' perspectives. The analytics-based product planning method was developed with eleven main process steps, using the measurements and measurement scores resulting from the interviews, and was finally validated in a case. The method can support all three assets of product planning (portfolio management, roadmapping, and release planning); however, it was validated only for roadmapping decisions in the current study. SaaS-based analytics are enablers for the method, but there might be other analytics that can assist planning decisions as well. Conclusion.
The results of the interviews on roadmapping decisions indicated that different planning decisions assign similar importance to the measurement categories when planning a software product. Statistics about feature use, product use, response time, users, errors and downtime have been recognized as the most important measurements for planning. Analytics increase knowledge about product usability and functionality, and can also help to improve problem handling and client-side technologies. However, analytics has limitations regarding form-based customer feedback, development technologies, and the interpretation of some measurements in practice, and immature products are not yet able to make use of analytics. To create, remove, or enhance a feature, the data trend provides a broad view of feature desirability at the current or even a future time and clarifies how these changes can impact decision making. Feature prioritization can be performed for features in the same context by comparing their measurement impacts. The analytics-based method covers both reactive and proactive planning.
APA, Harvard, Vancouver, ISO, and other styles
2

Štacha, Jan. "Analýza a návrh informačního systému elektronického vzdělávání." Master's thesis, Vysoké učení technické v Brně. Fakulta informačních technologií, 2007. http://www.nusl.cz/ntk/nusl-236840.

Full text
Abstract:
This thesis is focused on the IBM Rational Unified Process methodology, which represents a complex and robust approach to software development and the software lifecycle. The methodology is well described and every step of the software lifecycle is predictable, which is why it is coming into use in many software development organizations. The main goal of this thesis is a thorough description of the methodology and the creation of the common outputs of the inception and elaboration phases for an e-learning information system. IBM Rational Unified Process is called a use-case-driven approach, which is why the description of all use cases is emphasized.
APA, Harvard, Vancouver, ISO, and other styles
3

Kallergis, Georgios. "Business Software Engineering Processes : An Analytics Case Study." Thesis, KTH, Skolan för informations- och kommunikationsteknik (ICT), 2017. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-207138.

Full text
Abstract:
Using Information Technology (IT) solutions to automate business processes has been the norm for many organizations in recent years. Despite the great benefits an enterprise can reap from adopting such solutions, developing one's own IT solution is not as easy as it might seem. If we take into account the ever-increasing complexity of modern businesses and their operating environment, as well as the fast pace at which the modern world is changing, development and maintenance of such systems can easily become a daunting task. Many software development processes have been proposed over the years aiming at increasing software projects' success rates in terms of budget, time and requirements satisfaction. In this project (code named Helium), we propose a simple software development process customised specifically for Customer Value (CV) and we apply this process to develop a novel distributed IT system that automates the main business processes of the company. Our main goal is to reduce the operational costs of CV and increase its capacity and provided Quality of Service (QoS). The "start-up" nature of the company is taken into consideration, since it introduces a considerable amount of uncertainty, as is the fact that the initial set of projects in the project road-map are going to be thesis projects carried out by students rather than experienced professionals. The proposed distributed architecture aims at providing maintenance, expansion, performance and scalability benefits. A basic set of measurements, carried out on the implemented system, validates the correctness of our approach with respect to performance, and a set of interviews carried out with senior developers and managers validates the importance of the benefits of the process and architecture from a business standpoint.
APA, Harvard, Vancouver, ISO, and other styles
4

Rodríguez, Martínez Cecilia. "Software quality studies using analytical metric analysis." Thesis, KTH, Kommunikationssystem, CoS, 2013. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-120325.

Full text
Abstract:
Today engineering companies expend a large amount of resources on the detection and correction of bugs (defects) in their software. These bugs are usually due to errors and mistakes made by programmers while writing the code or the specifications. No tool is able to detect all of these bugs, and some of them remain undetected despite testing of the code. For these reasons, many researchers have tried to find indicators in a software's source code that can be used to predict the presence of bugs. Every bug in the source code is a potential failure of the program to perform as expected. Therefore, programs are tested with many different cases in an attempt to cover all the possible paths through the program and detect all of these bugs. Early prediction of bugs informs the programmers about the location of the bugs in the code. Thus, programmers can test the more error-prone files more carefully, and save a lot of time by not testing error-free files. This thesis project created a tool that is able to predict error-prone source code written in C++. In order to achieve this, we have utilized one predictor which has been extremely well studied: software metrics. Many studies have demonstrated that there is a relationship between software metrics and the presence of bugs. In this project a Neuro-Fuzzy hybrid model based on Fuzzy c-means and a Radial Basis Neural Network has been used. The efficiency of the model has been tested in a software project at Ericsson. Testing of this model showed that the program does not achieve high accuracy due to the lack of independent samples in the data set. However, experiments did show that classification models provide better predictions than regression models. The thesis concludes by suggesting future work that could improve the performance of this program.
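As a rough illustration of the kind of pipeline this abstract describes (a sketch under assumptions, not the author's code), the following Python snippet clusters per-file software metrics with a plain fuzzy c-means implementation and uses radial-basis features with a linear readout to flag likely error-prone files. The toy metrics, number of clusters, and decision threshold are all made up for illustration.

```python
# Hedged sketch of a fuzzy c-means + RBF-feature bug-prediction pipeline.
import numpy as np

def fuzzy_cmeans(X, n_clusters=3, m=2.0, n_iter=100, tol=1e-5, seed=0):
    """Plain fuzzy c-means: returns cluster centres and the membership matrix U."""
    rng = np.random.default_rng(seed)
    U = rng.random((len(X), n_clusters))
    U /= U.sum(axis=1, keepdims=True)            # memberships sum to 1 per sample
    for _ in range(n_iter):
        W = U ** m
        centres = (W.T @ X) / W.sum(axis=0)[:, None]
        d = np.linalg.norm(X[:, None, :] - centres[None, :, :], axis=2) + 1e-12
        U_new = 1.0 / (d ** (2 / (m - 1)))       # standard FCM membership update
        U_new /= U_new.sum(axis=1, keepdims=True)
        if np.abs(U_new - U).max() < tol:
            U = U_new
            break
        U = U_new
    return centres, U

def rbf_features(X, centres, gamma=1.0):
    """Radial-basis features: one Gaussian bump per fuzzy cluster centre."""
    d2 = ((X[:, None, :] - centres[None, :, :]) ** 2).sum(axis=2)
    return np.exp(-gamma * d2)

# Toy data standing in for real per-file metrics (e.g. LOC, complexity, fan-out).
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(0, 1, (80, 3)), rng.normal(3, 1, (20, 3))])
y = np.r_[np.zeros(80), np.ones(20)]             # 1 = file later turned out buggy

X = (X - X.mean(axis=0)) / X.std(axis=0)         # standardise metrics
centres, _ = fuzzy_cmeans(X, n_clusters=4)
Phi = np.c_[rbf_features(X, centres), np.ones(len(X))]   # RBF features + bias
w = np.linalg.lstsq(Phi, y, rcond=None)[0]       # linear readout (least squares)
pred = (Phi @ w) > 0.5                           # flag likely error-prone files
print("flagged as error-prone:", int(pred.sum()), "of", len(X))
```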
APA, Harvard, Vancouver, ISO, and other styles
5

Breda, Lorenzo <1995&gt. "Business Analytics for the Energy Management Information System software provider: a case study." Master's Degree Thesis, Università Ca' Foscari Venezia, 2021. http://hdl.handle.net/10579/19310.

Full text
Abstract:
Business data analysis comprises a set of concepts and methods for uncovering hidden patterns in a company's operational data. Although several solutions of this kind are available, the success of the analysis lies in understanding the business context and defining the objectives. This thesis presents the development of a conceptual analytical framework for performing Business Analytics within a company that provides Energy Management Information Systems (EMIS); the results of the analysis are then illustrated. EMIS are tools for monitoring energy consumption, used in energy management and energy efficiency programmes. In recent years, such programmes have gained relevance in organisations operating in different economic sectors, and the sector of specialised energy management and consulting services is consolidating. Within such a complex market, it is worthwhile for the EMIS provider to trace its sources of growth. Therefore, studying customers and sales with the objective of identifying the drivers of growth can reveal relevant information that allows the company to understand its own positioning. The first phase of the research focuses on identifying the key characteristics tied to the sales units, in order to develop a conceptual framework with which information can be extracted from the data. The second phase comprises the organisation and execution of the analysis. The results presented consist of the information visualised through charts.
APA, Harvard, Vancouver, ISO, and other styles
6

Kumar, Hemant, University of Western Sydney, and of Science Technology and Environment College. "Software analytical tool for assessing cardiac blood flow parameters." THESIS_FSTA_XXX_Kumar_H.xml, 2001. http://handle.uws.edu.au:8081/1959.7/392.

Full text
Abstract:
The introduction of Doppler ultrasound techniques into the Intensive Care setting has revolutionised the way haemodynamic status is monitored in the critically ill. However, in order to increase the usefulness of these techniques, the Doppler signal and its spectrum need to be further analysed in ways that facilitate a better clinical response. Extensive processing of the Doppler spectrum on diagnostic ultrasound machines is limited by real-time performance considerations. It was therefore proposed that the spectral information from these systems be extracted off-line and that a full set of analytical tools be made available to evaluate this information. This was achieved by creating an integrated and modular software tool called Spectron, which was intended as an aid in the overall management of patients. The modular nature of Spectron was intended to ensure that new analytical tools and techniques could be easily added and tested. The software provides its users with considerable latitude in choosing various data acquisition and analysis parameters to suit various clinical situations and patient requirements. Spectron was developed under the Windows environment to provide a user-friendly interface and to address a range of programming problems such as memory management and the size of the colour palettes. Spectron is able to detect the maximal velocities and compute the mean and median velocities. Relative increases in maximal velocities in cardiac blood flows after the administration of inotropic drugs have been shown in the pilot studies that were conducted. Spectron is able to help in obtaining estimates of aortic blood flows and in other applications such as measuring vascular impedance. Stenotic blood flows can be detected by using the spectral broadening index, and blood flow characteristics can be studied by using various blood flow indices. Thus, this project attempted to help in patient management by providing clinicians with a range of blood flow parameters and has succeeded in meeting its objective to a large extent.
Master of Engineering (Hons)
APA, Harvard, Vancouver, ISO, and other styles
7

Kumar, Hemant. "Software analytical tool for assessing cardiac blood flow parameters." Thesis, View thesis, 2001. http://handle.uws.edu.au:8081/1959.7/392.

Full text
Abstract:
The introduction of Doppler ultrasound techniques into the Intensive Care setting has revolutionised the way haemodynamic status is monitored in the critically ill. However, in order to increase the usefulness of these techniques, the Doppler signal and its spectrum need to be further analysed in ways that facilitate a better clinical response. Extensive processing of the Doppler spectrum on diagnostic ultrasound machines is limited by real-time performance considerations. It was therefore proposed that the spectral information from these systems be extracted off-line and that a full set of analytical tools be made available to evaluate this information. This was achieved by creating an integrated and modular software tool called Spectron, which was intended as an aid in the overall management of patients. The modular nature of Spectron was intended to ensure that new analytical tools and techniques could be easily added and tested. The software provides its users with considerable latitude in choosing various data acquisition and analysis parameters to suit various clinical situations and patient requirements. Spectron was developed under the Windows environment to provide a user-friendly interface and to address a range of programming problems such as memory management and the size of the colour palettes. Spectron is able to detect the maximal velocities and compute the mean and median velocities. Relative increases in maximal velocities in cardiac blood flows after the administration of inotropic drugs have been shown in the pilot studies that were conducted. Spectron is able to help in obtaining estimates of aortic blood flows and in other applications such as measuring vascular impedance. Stenotic blood flows can be detected by using the spectral broadening index, and blood flow characteristics can be studied by using various blood flow indices. Thus, this project attempted to help in patient management by providing clinicians with a range of blood flow parameters and has succeeded in meeting its objective to a large extent.
APA, Harvard, Vancouver, ISO, and other styles
8

Kumar, Hemant. "Software analytical tool for assessing cardiac blood flow parameters /." View thesis, 2001. http://library.uws.edu.au/adt-NUWS/public/adt-NUWS20030724.122149/index.html.

Full text
APA, Harvard, Vancouver, ISO, and other styles
9

Bathla, Rajender, and Anil Kapil. "Analytical Scenario of Software Testing Using Simplistic Cost Model." IJCSN, 2012. http://hdl.handle.net/10150/219531.

Full text
Abstract:
Software testing is the process of executing a program with the intention of finding errors in the code. It is the process of exercising or evaluating a system or system component by manual or automatic means to verify that it satisfies specified requirements, or to identify differences between expected and actual results [4]. Software testing should not be a distinct phase in system development but should be applicable throughout the design, development and maintenance phases. Software testing is often used in association with the terms verification and validation; it is the process of executing software in a controlled manner in order to answer the question: does the software behave as specified? One way to ensure a system's reliability is to test it extensively, and since software is a system component it requires a testing process as well.
Software can be tested either manually or automatically. The two approaches are complementary: automated testing can perform a huge number of tests in a short time, whereas manual testing uses the knowledge of the testing engineer to target testing at the parts of the system that are assumed to be more error-prone. Despite this complementarity, tools for manual and automatic testing are usually different, leading to decreased productivity and reliability of the testing process. AutoTest is a testing tool that provides a "best of both worlds" strategy: it integrates developers' test cases into an automated process of systematic contract-driven testing. This allows it to combine the benefits of both approaches while keeping a simple interface, and to treat the two types of tests in a unified fashion: evaluation of results is the same, coverage measures are added up, and both types of tests can be saved in the same format. The objective of this paper is to discuss the importance of automation tools in association with software testing techniques in software engineering. In this paper we provide an introduction to software testing and describe CASE tools. The solution of this problem leads to the new approach to software development known in the IT world as software testing. Software test automation is the process of automating the steps of manual test cases using an automation tool or utility to shorten the testing life cycle with respect to time.
APA, Harvard, Vancouver, ISO, and other styles
10

D’Avila, Leandro Ferreira. "SW-Context : um modelo para software analytics baseado em sensibilidade ao contexto." Universidade do Vale do Rio dos Sinos, 2017. http://www.repositorio.jesuita.org.br/handle/UNISINOS/6285.

Full text
Abstract:
Developers recurrently need to deal with maintenance activities on existing applications in order to adapt them to new scenarios and needs, for example new features, bug fixes and legal changes. Besides that, developers often deal with organizational factors that have a potential impact on the success or failure of software development projects. Some of these factors are: a large amount of old, poorly documented software; many interdependencies between software modules; and expert developers who have left the company. A way to mitigate the impact of these factors on software correctness and maintainability is to provide useful information regarding the context of the code or application under development, using an analytics approach. The availability of this information gives the developer a better understanding of the issues surrounding the software and its environment. SW-Context aims to allow a combination of different information related to software artifacts in order to improve the situational awareness of developers in development and maintenance activities. The main challenges of the model are: the definition of what information must compose the software context, the structured storage of this contextual information in context histories, and, finally, the analysis and availability of this context information in a way that supports development and maintenance activities, using the Software Analytics concept. A prototype was implemented containing the main concepts of the proposed model. The prototype was populated with the contextual information of actual applications under development by a software company and was evaluated through a case study, in which 12 developers used it in their daily activities for one month. At the end of this period, the developers responded to a questionnaire in which usefulness and ease of use were measured. The evaluation of the model obtained satisfactory ratings both for perceived ease of use and for the usefulness of the system. The consolidation of the contextual information in a single location and the availability of this correlated information in graphical form, through a dashboard, achieved the objective of improving the situational awareness of software developers in maintenance activities.
APA, Harvard, Vancouver, ISO, and other styles
11

Колеснікова, Марія Вікторівна, Мария Викторовна Колесникова, Mariia Viktorivna Kolesnikova та К. І. Губа. "Адміністративно-правові засади здійснення інформаційно-аналітичного забезпечення Національної поліції в період реформування". Thesis, Класичний приватний університет, 2018. http://essuir.sumdu.edu.ua/handle/123456789/72333.

Full text
Abstract:
Information and its reception, analysis and synthesis form the basis of human activity. The activities of police officers are also associated with huge amounts of information, which in many cases is operational, official or secret, and may even constitute a state secret. In this context, the informational and analytical support of the activities of the National Police of Ukraine is inextricably linked with the information security of the state as a whole.
APA, Harvard, Vancouver, ISO, and other styles
12

A, Taher Ali. "BIM Software Capability and Interoperability Analysis : An analytical approach toward structural usage of BIM software (S-BIM)." Thesis, KTH, Byggnadsteknik, 2016. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-184856.

Full text
Abstract:
This study focused on the structural analysis of BIM models. Different commercial software packages (Autodesk products and Rhinoceros) are presented through the modelling and analysis of different structures with varying complexity, section properties, geometry and material. Besides the commercial software, different architectural and structural-analysis tools and workflows are evaluated (Dynamo, Grasshopper, add-on tools, direct link, indirect link via IFC).
BIM and Structural BIM (S-BIM)
APA, Harvard, Vancouver, ISO, and other styles
13

Wang, Yang. "Digital film dosimetry in radiotherapy and the development of analytical applications software." Access electronically, 2005. http://www.library.uow.edu.au/adt-NWU/public/adt-NWU20060223.150107/index.html.

Full text
APA, Harvard, Vancouver, ISO, and other styles
14

Goosen, Ryno Johannes. "Sense, signal and software : a sensemaking analysis of meaning in early warning systems." Thesis, Stellenbosch : Stellenbosch University, 2014. http://hdl.handle.net/10019.1/96132.

Full text
Abstract:
Thesis (MPhil)--Stellenbosch University, 2014.
This thesis considers the contribution that Karl Weick's notion of sensemaking can make to an improved understanding of weak signals, cues, warning analysis, and software within early warning systems. Weick's sensemaking provides a framework through which the above-mentioned concepts are discussed and analysed. The concepts of weak signals, early warning systems, and Visual Analytics are investigated from within current business and formal intelligence viewpoints. Intelligence failure has been a characteristic of events such as 9/11, the recent financial crisis triggered by the collapse of Lehman Brothers, and the so-called Arab Spring. Popular methodologies such as early warning analysis, weak signal analysis and environmental scanning, employed within both the business and government spheres, failed to provide adequate early warning in many of these events. These failures warrant renewed attention as to what improvements can be made and how new technology can enhance early warning analysis. Chapter One is introductory; it states the research question and methodology and delimits the thesis. Chapter Two sets the scene by investigating current conceptions of the main constructs. Chapter Three explores Weick's theory of sensemaking and provides the analytical framework against which these concepts are then analysed in Chapter Four. The emphasis is directed towards the extent of integration of frames within the analysis phase of early warning systems and how frames may be incorporated within the theoretical foundation of Visual Analytics to enhance warning systems. The findings of this thesis suggest that Weick's conceptualisation of sensemaking provides conceptual clarity to weak signal analysis, in that Weick's "seed" metaphor, representing the embellishment and elaboration of cues, epitomizes the progressive nature of weak signals. The importance of Weick's notion of belief-driven sensemaking, in particular the role of expectation in the elaboration of frames, as discussed and confirmed by various researchers in different study areas, is a core feature underlined in this thesis. The centrality of the act of noticing, and the effect that framing and re-framing have on it, is highlighted as a primary notion in the process of not only making sense of warning signals but identifying them in the first place. This ties in with the valuable contribution Weick's sensemaking makes to understanding the effect that a specification has on identifying transients and signals in the resulting visualization in Visual Analytics software.
APA, Harvard, Vancouver, ISO, and other styles
15

Lee, Howard. "A multi-dimensional analytical framework for hierarchical reasoning in space and time." Thesis, University of Kent, 2000. https://kar.kent.ac.uk/21962/.

Full text
APA, Harvard, Vancouver, ISO, and other styles
16

Koza, Jacob. "Active Analytics: Suggesting Navigational Links to Users Based on Temporal Analytics Data." UNF Digital Commons, 2019. https://digitalcommons.unf.edu/etd/892.

Full text
Abstract:
Front-end developers are tasked with keeping websites up to date while optimizing user experiences and interactions. Tools and systems have been developed to give these individuals granular analytic insight into who is interacting with their sites, with what, and how. These systems maintain a historical record of user interactions that can be leveraged for design decisions. Developing a framework to aggregate those historical usage records and using it to anticipate user interactions on a webpage could automate the task of optimizing web pages. In this research a system called Active Analytics was created that takes Google Analytics historical usage data and provides a dynamic front-end system for automatically updating web-page navigational elements. The previous year's data is extracted from Google Analytics and transformed into a summarization of top navigation steps. Once stored, a responsive front-end system selects from this data a timespan of three weeks from the previous year: the current, previous and next weeks. The most frequently reached pages, or their parent pages, have their navigational UI elements highlighted on a top-level or landing page, to attempt to reduce the effort needed to reach those pages. The Active Analytics framework was evaluated by recruiting volunteers and randomly assigning them one of two versions of a site, one with the framework and one without. It was found that users of the framework-enabled site were able to navigate the site more easily than with the original.
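The window-and-ranking step described in this abstract could look roughly like the following Python sketch. The flat row layout, the one-week-back/two-weeks-forward window and the `pages_to_highlight` helper are assumptions for illustration, not the thesis implementation.

```python
# Hedged sketch: pick a three-week window of last year's analytics data and
# rank the pages whose navigation links a landing page would highlight.
from datetime import date, timedelta
from collections import Counter

def pages_to_highlight(rows, today, top_n=5):
    """rows: iterable of (date, page_path, pageviews) from the previous year."""
    anchor = today.replace(year=today.year - 1)      # same calendar date, last year
    start, end = anchor - timedelta(days=7), anchor + timedelta(days=13)
    counts = Counter()
    for day, path, views in rows:
        if start <= day <= end:                      # previous/current/next week
            counts[path] += views
    return [path for path, _ in counts.most_common(top_n)]

# Toy usage with made-up paths and counts.
rows = [
    (date(2018, 6, 1), "/admissions/", 120),
    (date(2018, 6, 3), "/courses/fall/", 300),
    (date(2018, 6, 10), "/courses/fall/", 250),
    (date(2018, 12, 1), "/alumni/", 900),            # outside the window, ignored
]
print(pages_to_highlight(rows, today=date(2019, 6, 5)))
```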
APA, Harvard, Vancouver, ISO, and other styles
17

Ghodsypour, Seyed Hassan. "A decision support system for supplier selection integrating analytical hierarchy process with operations research methods." Thesis, University of Nottingham, 1996. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.337182.

Full text
APA, Harvard, Vancouver, ISO, and other styles
18

Anis, Sadia Shahnoor. "A Design Choice Guideline for Software-Defined Network Control Plane Architecture using Analytical Hierarchical Process." University of Akron / OhioLINK, 2021. http://rave.ohiolink.edu/etdc/view?acc_num=akron1608144391722863.

Full text
APA, Harvard, Vancouver, ISO, and other styles
19

Holmes, James R. "Development of Operational and Teaching Software for a Complex Analytical Instrument Using Virtual Instrument Technology." Thesis, Curtin University, 2002. http://hdl.handle.net/20.500.11937/582.

Full text
Abstract:
It is not always possible to provide students and new users of complex instrumentation with sufficient hands-on use to fully develop the required knowledge of the instrument. Access may also be limited when there is a need to develop data collection and processing procedures. One solution to this problem is to develop a simulation of the instrument in readily accessible computer software. Modern computer-based technology allows traditional instrumentation to be replaced with Virtual Instruments consisting of digital control/acquisition hardware and software that graphically represents the functions of the physical instrument. In this thesis, operating and analysis software to simulate the operation of complex analytical instrumentation was successfully developed using a numerical model of the instrument. The approach will reduce the need for machine time for operator training and the development of data collection and processing procedures. In particular, the thesis developed software to emulate the behaviour of a VG-354 Thermal Ionisation Mass Spectrometer. Graphical programming tools were employed to create a modular set of Virtual Instruments that formed the basis of the model. The Simulated Mass Spectrometer produced results that compared well with real data obtained from the physical instrument. Virtual Instrument peak centring and measurement modules were then developed to operate the Simulated Mass Spectrometer in peak jumping mode. Uncertainties were reduced with improved analysis techniques employing polynomial least-squares fits for peak centring and single-collector isotope ratio measurements. The techniques also have the potential to accommodate hysteresis effects in the magnetic sector analyser, further reducing uncertainty.
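The peak-centring idea mentioned here can be illustrated with a short numerical sketch (assumed toy data and parameters, not the thesis software): fit a low-order polynomial by least squares to the top of a measured peak and take the position of its maximum as the centre.

```python
# Hedged sketch of polynomial least-squares peak centring.
import numpy as np

def peak_centre(positions, intensities, top_fraction=0.6, degree=2):
    """Fit a parabola (or higher-order polynomial) to the upper part of the
    peak and return the position of its fitted maximum."""
    mask = intensities >= top_fraction * intensities.max()
    coeffs = np.polyfit(positions[mask], intensities[mask], degree)
    fine = np.linspace(positions[mask].min(), positions[mask].max(), 2001)
    return fine[np.argmax(np.polyval(coeffs, fine))]

# Toy peak with a little noise; the true centre is at 87.905 (arbitrary units).
pos = np.linspace(87.880, 87.930, 51)
sig = (np.exp(-((pos - 87.905) ** 2) / (2 * 0.006 ** 2))
       + 0.01 * np.random.default_rng(0).normal(size=pos.size))
print(round(peak_centre(pos, sig), 4))
```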
APA, Harvard, Vancouver, ISO, and other styles
20

Holmes, James R. "Development of Operational and Teaching Software for a Complex Analytical Instrument Using Virtual Instrument Technology." Curtin University of Technology, Department of Applied Physics, 2002. http://espace.library.curtin.edu.au:80/R/?func=dbin-jump-full&object_id=13792.

Full text
Abstract:
It is not always possible to provide students and new users of complex instrumentation with sufficient hands-on use to fully develop the required knowledge of the instrument. Access may also be limited when there is a need to develop data collection and processing procedures. One solution to this problem is to develop a simulation of the instrument in readily accessible computer software. Modern computer-based technology allows traditional instrumentation to be replaced with Virtual Instruments consisting of digital control/acquisition hardware and software that graphically represents the functions of the physical instrument. In this thesis, operating and analysis software to simulate the operation of complex analytical instrumentation was successfully developed using a numerical model of the instrument. The approach will reduce the need for machine time for operator training and the development of data collection and processing procedures. In particular, the thesis developed software to emulate the behaviour of a VG-354 Thermal Ionisation Mass Spectrometer. Graphical programming tools were employed to create a modular set of Virtual Instruments that formed the basis of the model. The Simulated Mass Spectrometer produced results that compared well with real data obtained from the physical instrument. Virtual Instrument peak centring and measurement modules were then developed to operate the Simulated Mass Spectrometer in peak jumping mode. Uncertainties were reduced with improved analysis techniques employing polynomial least-squares fits for peak centring and single-collector isotope ratio measurements. The techniques also have the potential to accommodate hysteresis effects in the magnetic sector analyser, further reducing uncertainty.
APA, Harvard, Vancouver, ISO, and other styles
21

Davis, James A. "Analytical modelling for the performance prediction and optimisation of near-neighbour structured grid hydrodynamics." Thesis, University of Warwick, 2017. http://wrap.warwick.ac.uk/92537/.

Full text
Abstract:
The advent of modern High Performance Computing (HPC) has facilitated the use of powerful supercomputing machines that have become the backbone of data analysis and simulation. With such a variety of software and hardware available today, understanding how well such machines can perform is key for both efficient use and future planning. With significant costs and multi-year turn-around times, procurement of a new HPC architecture can be a significant undertaking. In this work, we introduce one such measure to capture the performance of such machines: analytical performance models. These models provide a mathematical representation of the behaviour of an application in the context of how its various components perform for an architecture. Parameterising the workload in such a way that the time taken to compute can be described in relation to one or more benchmarkable statistics allows for a reusable representation of an application that can be applied to multiple architectures. This work goes on to introduce one such benchmark of interest, Hydra. Hydra is a benchmark 3D Eulerian structured mesh hydrocode implemented in Fortran, with which the explosive compression of materials, shock waves, and the behaviour of materials at the interface between components can be investigated. We assess its scaling behaviour and use this knowledge to construct a performance model that accurately predicts the runtime to within 15% across three separate machines, each with its own distinct characteristics. Further, this work explores various optimisation techniques, some of which yield a marked speedup in the overall wall time of the application. Finally, another software application of interest with similar behaviour patterns, PETSc, is examined to demonstrate how different applications can exhibit similar modellable patterns.
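For readers unfamiliar with the term, an analytical performance model of the general kind described here often takes a form like the following. The split into a compute term and a near-neighbour (halo-exchange) communication term, and all symbols, are a generic illustration and are not taken from Hydra's actual model.

```latex
% Generic illustration only (assumed form, not the thesis model):
% per-iteration runtime on p processors.
T_{\mathrm{iter}}(N, p) \;\approx\;
    \underbrace{\frac{N}{p}\, t_{\mathrm{grind}}}_{\text{local compute}}
    \;+\;
    \underbrace{k_{\mathrm{msg}} \left( t_{\mathrm{lat}} + \frac{s(N, p)}{B} \right)}_{\text{halo exchange}}
```

Here N is the number of mesh cells, t_grind the benchmarked time per cell update, k_msg the number of messages per step, t_lat the network latency, s(N, p) the halo message size, and B the bandwidth; each of these is a quantity one could benchmark on a candidate machine, which is what makes the representation reusable across architectures.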
APA, Harvard, Vancouver, ISO, and other styles
22

Li, Yan. "NEW ARTIFACTS FOR THE KNOWLEDGE DISCOVERY VIA DATA ANALYTICS (KDDA) PROCESS." VCU Scholars Compass, 2014. http://scholarscompass.vcu.edu/etd/3609.

Full text
Abstract:
Recently, interest in the business application of analytics and data science has increased significantly. The popularity of data analytics and data science comes from the clear articulation of business problem solving as an end goal. To address limitations in the existing literature, this dissertation provides four novel design artifacts for Knowledge Discovery via Data Analytics (KDDA). The first artifact is a Snail Shell KDDA process model that extends existing knowledge discovery process models but addresses many existing limitations. At the top level, the KDDA process model highlights the iterative nature of KDDA projects and adds two new phases, namely Problem Formulation and Maintenance. At the second level, the generic tasks of the KDDA process model are presented in a comparative manner, highlighting the differences between the new KDDA process model and traditional knowledge discovery process models. Two case studies are used to demonstrate how to use the KDDA process model to guide real-world KDDA projects. The second artifact, a methodology for theory building based on quantitative data, is a novel application of the KDDA process model. The methodology is evaluated using a theory-building case from the public health domain. It is not only an instantiation of the Snail Shell KDDA process model, but also makes theoretical contributions to theory building: it demonstrates how analytical techniques can be used as quantitative gauges to assess important construct relationships during the formative phase of theory building. The third artifact is a data mining ontology, the DM3 ontology, to bridge the semantic gap between business users and KDDA experts and to facilitate analytical model maintenance and reuse. The DM3 ontology is evaluated using both a criteria-based and a task-based approach. The fourth artifact is a decision support framework for MCDA software selection. The framework enables users to choose relevant MCDA software based on a specific decision-making situation (DMS). A DMS modeling framework is developed to structure the DMS based on the decision problem and the users' decision preferences. The framework is implemented in a decision support system and evaluated using application examples from the real-estate domain.
APA, Harvard, Vancouver, ISO, and other styles
23

Dalci, Mustafa. "Using Google Analytics, Card Sorting And Search Statistics For Getting Insights About Metu Website." Master's thesis, METU, 2011. http://etd.lib.metu.edu.tr/upload/12613077/index.pdf.

Full text
Abstract:
Websites are one of the most popular and quickest ways of communicating with users and providing information. Measuring the effectiveness of a website, the availability of information on the website and the information architecture on users
APA, Harvard, Vancouver, ISO, and other styles
24

Vasudev, R. Sashin, and Ashok Reddy Vanga. "Accuracy of Software Reliability Prediction from Different Approaches." Thesis, Blekinge Tekniska Högskola, Avdelningen för för interaktion och systemdesign, 2008. http://urn.kb.se/resolve?urn=urn:nbn:se:bth-1298.

Full text
Abstract:
Many models have been proposed for software reliability prediction, but none of these models can capture a sufficient amount of software characteristics. We have proposed a mixed approach, using both analytical and data-driven models, for assessing the accuracy of reliability prediction, involving a case study. This report follows a qualitative research strategy. Data was collected from a case study conducted at three different companies. Based on the case study, an analysis is made of the approaches used by the companies, also drawing on other data related to the organizations' Software Quality Assurance (SQA) teams. Of the three organizations, the first two used for the case study are working on reliability prediction, while the third is a growing company developing a product with less focus on quality. Data was collected by means of interviews with an employee of the organization who leads a team and has been in a managing position for at least the last two years.
APA, Harvard, Vancouver, ISO, and other styles
25

Sundström, Heléne. "Analytical tools for monitoring and control of fermentation processes." Doctoral thesis, KTH, Skolan för bioteknologi (BIO), 2007. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-4531.

Full text
Abstract:
The overall objective of this work has been to adopt new developments and techniques in the area of measurement, modelling and control of fermentation processes. Flow cytometry and software sensors are techniques which were considered ready for application, and the focus was set on developing tools for research aiming at understanding the relationship between measured variables and process quality parameters. In this study, fed-batch cultivations were performed with two different strains of Escherichia coli (E. coli) K12 W3110, with and without a gene for the recombinant protein promegapoietin. Inclusion body formation was followed during the process with flow cytometric detection, by labelling the inclusion bodies first with an antibody against the protein promegapoietin and then with a second, fluorescent anti-antibody. The approach of labelling inclusion bodies directly in disintegrated and diluted cell slurry could be adopted as a method to follow protein production during the process, although the labelling procedure, with incubation times and washings, was somewhat time-consuming (1.5 h). Labelling of inclusion bodies inside the cells to follow protein production was feasible, although an unexplained decrease in the relative fluorescence intensity occurred late in the process. However, it is difficult to translate this qualitative measurement into a quantitative one, since a quantitative protein analysis should give data proportional to the volume, while the labelling of the spherical inclusion bodies gives a signal corresponding to the area of the body, and calibration is not possible. The methods were shown to be useful for monitoring inclusion body formation, but it seems difficult to obtain quantitative information from the analysis. Population heterogeneity analysis was performed, using flow cytometry, on a cell population which lost 80-90% viability according to viable count analysis. It was possible to show that the apparent cell death was due to cells being incapable of dividing on agar plates after induction. These cells continued to produce the induced recombinant protein. It was shown that almost all cells in the population (≈97%) contained PMP, and furthermore total protein analysis of the medium indicated that only about 1% of the population had lysed. This confirms that the cells deemed "non-viable" by cfu-based viable counts still produced product. The software sensors XNH3 and µNH3, which utilise base titration data to estimate biomass and specific growth rate, were shown to correlate well with the off-line analyses during cultivation of E. coli W3110 in minimal medium. In rich medium, the µNH3 sensor was shown to give a signal that may be used as a fingerprint of the process, at least from the time of induction. The software sensor KLaC* was shown to respond to foaming in the culture, probably caused by increased air bubble dispersion. The RO/S coefficient, which describes oxygen-to-substrate consumption, was shown to give a distinct response to stress caused by lowered pH and by addition of the inducing agent IPTG. The software sensor for biomass was applied to a highly automated 6-unit multi-bioreactor system intended for fast process development. In this way, specific rates of substrate and oxygen consumption also became available without manual sampling.
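The principle behind the titration-based biomass sensor mentioned above can be sketched as follows (a minimal illustration under assumed yield constants, not the thesis implementation): ammonia added for pH control supplies the nitrogen incorporated into new biomass, so cumulative base addition can be scaled into a biomass estimate, and the specific growth rate follows from the slope of ln(X).

```python
# Hedged sketch of a titration-based software sensor for biomass and growth rate.
import numpy as np

Y_XN = 7.0          # g biomass per g ammonia-nitrogen (made-up placeholder yield)
N_FRAC = 0.823      # g nitrogen per g NH3

def biomass_from_titration(base_added_g, x0_g):
    """Cumulative ammonia added (g) -> estimated total biomass (g)."""
    return x0_g + Y_XN * N_FRAC * np.asarray(base_added_g)

def specific_growth_rate(t_h, x_est_g):
    """mu(t) = d ln(X) / dt, estimated numerically from the biomass signal."""
    return np.gradient(np.log(x_est_g), t_h)

t = np.linspace(0, 8, 33)                  # hours
base = 0.5 * (np.exp(0.4 * t) - 1)         # toy cumulative NH3 addition (g)
X = biomass_from_titration(base, x0_g=2.0)
mu = specific_growth_rate(t, X)
print(f"estimated biomass at end: {X[-1]:.1f} g, mu at end: {mu[-1]:.2f} 1/h")
```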
APA, Harvard, Vancouver, ISO, and other styles
26

Ludwig, Lars. "Analytical investigations and numerical experiments for singularly perturbed convection-diffusion problems with layers and singularities using a newly developed FE-software." Doctoral thesis, Saechsische Landesbibliothek- Staats- und Universitaetsbibliothek Dresden, 2014. http://nbn-resolving.de/urn:nbn:de:bsz:14-qucosa-137301.

Full text
Abstract:
In the field of singularly perturbed reaction- or convection-diffusion boundary value problems, the research area of a priori error analysis for the finite element method has already been thoroughly investigated. In particular, for mesh-adapted methods and/or various stabilization techniques, works have been done that prove optimal rates of convergence or supercloseness uniformly in the perturbation parameter epsilon. Commonly, however, it is assumed that the exact solution behaves nicely in that it obeys certain regularity assumptions, although in general, e.g. due to corner singularities, these regularity requirements are not satisfied. So far, insufficient regularity has been met by assuming compatibility conditions on the data. The present thesis originated from the question: what can be shown if these rather unrealistic additional assumptions are dropped? We are interested in epsilon-uniform a priori estimates for convergence and superconvergence that include a regularity parameter adjustable to the smoothness of the exact solution. A major difficulty that occurs when seeking the numerical error decay is that the exact solution is not known. Since we strive for reliable rates of convergence, we want to avoid the standard approach of the "double-mesh principle". Our choice is to use reference solutions as a substitute for the exact solution. Numerical experiments are intended to confirm the theoretical results and to bring further insights into the interplay between layers and singularities. To computationally realize the demanding practical aspects of the finite element method that thereby arise, a new software package is developed that turns out to be particularly suited to the needs of the numerical analyst. Its design, features and implementation are described in detail in the second part of the thesis.
APA, Harvard, Vancouver, ISO, and other styles
27

Brito, Marcos Alves de. "A utilização do software GeoGebra no ensino da geometria analítica." Universidade Federal de São Carlos, 2014. https://repositorio.ufscar.br/handle/ufscar/7078.

Full text
Abstract:
In this thesis we aim, through the use of a technological tool, the computer, to show the possibility of teaching Mathematics in a way that leads the student to build their own knowledge from concepts presented with the help of software that allows the activities developed to be visualized. To this end, we use the GeoGebra software, a dynamic geometry program in the public domain whose application to analytic geometry content is easy for the user to learn. We present a teaching sequence using GeoGebra in the teaching of Analytic Geometry, following the contents of the Curriculum Proposal of the State of São Paulo. In each of the proposed activities we develop the theoretical content with exercises, followed by an activity in GeoGebra. At the end of each activity some of the comments made by the students are presented.
APA, Harvard, Vancouver, ISO, and other styles
28

Lima, Edison Fernando da Silva. "Uma proposta para o ensino de cônicas com o auxílio do software Maple." Universidade Estadual da Paraíba, 2015. http://tede.bc.uepb.edu.br/tede/jspui/handle/tede/2389.

Full text
Abstract:
In this final course work we approach conics within the study of Analytic Geometry and discuss procedures for recognizing the nature of the general second-degree equation in two variables, analyzing the coefficients of the equation by an algebraic process. To simplify the calculations involved, as well as to sketch the graphs, the mathematical application Maple 18 is used. Our aim is to develop the students' ability to relate and better retain the mathematical contents covered, since such an approach is not worked through in textbooks at the high-school level.
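As a brief illustration of the kind of coefficient analysis referred to above, the following lines state the classical discriminant-based classification of the general second-degree equation (standard textbook material, not quoted from the dissertation).

```latex
% General second-degree equation in two variables and its classical classification
\begin{equation}
  Ax^{2} + Bxy + Cy^{2} + Dx + Ey + F = 0
\end{equation}
% Discriminant-based classification (non-degenerate cases):
%   B^{2} - 4AC < 0  =>  ellipse (a circle if A = C and B = 0)
%   B^{2} - 4AC = 0  =>  parabola
%   B^{2} - 4AC > 0  =>  hyperbola
```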
APA, Harvard, Vancouver, ISO, and other styles
29

Rausch, Philip. "Predicate-induced semantic prominence in online argument linking: experiments on affectedness and analytical tools." Doctoral thesis, Humboldt-Universität zu Berlin, 2018. http://dx.doi.org/10.18452/19305.

Full text
Abstract:
Part I of this thesis investigates the effects of a semantic feature of syntactically transitive predicates on online sentence processing: the degree of affectedness such a predicate implies for one of its arguments, taken to indicate the extent to which an event participant undergoes a change of state during the event expressed. Three experiments (using acceptability judgements, reading times & event-related potentials [ERPs]) probed the effects of affectedness on the processing of the predicate and on the integration of following argument noun phrases (NPs), thus assessing the impact of affectedness on lexical-semantic processing and its influence on processes related to online argument linking. Maximising the effects of affectedness on linking-related processes, the experiments used German -ung nominalisations derived from verbs implying different degrees of affectedness, exploiting a paradigm involving the linking of either object or subject genitive argument NPs to deverbal, eventive nominalisations. While no clear effects of affectedness emerged for the processing of the nominalisations, results related to the integration of genitive NPs revealed a specific interaction between the degree of affectedness and the acceptability of realising either object or subject genitives. This pattern was accompanied by consistent reading time and ERP interaction effects. ERP results suggest a prominent role of two late positivities with different spatial foci in the integration of the genitive arguments.
These findings are discussed in the context of theoretical models of graded affectedness and sentence processing models, considering possible roles of argument prototypicality and differences between predicate-induced and argument-inherent semantic argument prominence. Part II introduces two software packages for the statistical platform R, offering interfaces for the WFMM software (Morris & Carroll, 2006) used for the analysis of the ERP data in part I.
APA, Harvard, Vancouver, ISO, and other styles
30

Sueur, Maxime. "Characterization of organic aerosols from ship emissions by high resolution mass spectrometry : Development of new analytical methods and data visualization software." Electronic Thesis or Diss., Normandie, 2024. http://www.theses.fr/2024NORMR006.

Full text
Abstract:
Organic aerosols (OA), whether of anthropogenic or natural origin, have a major impact on the environment, both in terms of climate change and health effects. Among the many sources of OA, the maritime transport sector occupies a far from negligible place, on the one hand because of the annual volume of exhaust emitted and its nature, and on the other because of the proximity of the emissions to inhabited areas such as ports or coastlines. Although the composition of these exhausts is monitored by means of routine analyses, it is important to study them in depth in order to fine-tune regulations on marine traffic. The analysis of the molecular-scale composition of a mixture as complex as ship emissions requires techniques such as mass spectrometry (MS), and in particular Fourier transform mass spectrometry (FTMS). FTMS, however, generates a large amount of data and generally requires processing and visualization software to extract and highlight the relevant information. Moreover, although FTMS provides information on the molecular composition of a sample, it does not allow the isomeric diversity to be evaluated, unlike ion mobility spectrometry (IMS).
So, with the aim of characterizing ship emissions, we adopted three lines of work: the development of open-access software in Python to facilitate the processing and visualization of FTMS data for the characterization of complex mixtures; the targeted characterization of petroporphyrins in naval fuels and their combustion products by Fourier transform ion cyclotron resonance (FTICR) mass spectrometry using electron-transfer matrix-assisted laser desorption/ionization (ET-MALDI); and the study of the structural modifications caused by photochemical aging of ship emissions using IMS-MS.
APA, Harvard, Vancouver, ISO, and other styles
31

Pfeiffer, Philip Edward. "A System for Determining the Statistical Significance of the Frequency of Short DNA Motif Matches in a Genome - An Analytical Approach." University of Dayton / OhioLINK, 2011. http://rave.ohiolink.edu/etdc/view?acc_num=dayton1304599225.

Full text
APA, Harvard, Vancouver, ISO, and other styles
32

Santos, Ricardo de Souza. "Tecnologias digitais na sala de aula para aprendizagem de conceitos de geometria analítica: manipulações no software GrafEq." reponame:Biblioteca Digital de Teses e Dissertações da UFRGS, 2008. http://hdl.handle.net/10183/15880.

Full text
Abstract:
This study addresses the use of resources made available by digital technologies in the teaching and learning of Mathematics. More specifically, the object of study is the introduction of the software GrafEq into the teaching of Analytic Geometry in secondary education, with reflections on the contributions identified. To verify the reach of these contributions, a sequence of activities was carried out in two second-year classes of a private secondary school in Porto Alegre. The results were analyzed empirically using the case study method. The study was grounded in James J. Kaput's theories on the introduction of digital technologies into Mathematics Education. The results point to the use of digital technologies as a possible contribution to the teaching and learning of Analytic Geometry, which constitutes an important topic of Mathematics in secondary education. As complementary elements of this dissertation, a tutorial for using the program was produced in the form of web pages (html), together with a set of activities involving topics of Analytic Geometry and the use of the software.
APA, Harvard, Vancouver, ISO, and other styles
33

Karthikeyan, Arun Kumar, and Praveen Kumar Mani. "Visual and Analytical Support for Real-time Evaluation of Railway Traffic Re-scheduling Alternatives During Disturbances." Thesis, Blekinge Tekniska Högskola, Sektionen för datavetenskap och kommunikation, 2012. http://urn.kb.se/resolve?urn=urn:nbn:se:bth-4299.

Full text
Abstract:
Disturbances in the railway network are frequent and, to some extent, inevitable. When this happens, the traffic dispatchers need to re-schedule the train traffic, and there is a need for decision support in this process. One purpose of such a decision support system would be to visualize the relevant, alternative re-scheduling solutions and benchmark them based on a set of relevant train traffic attributes which quantify the effects of each solution. Currently, there are two research projects financed by the Swedish Transport Administration (i.e. Trafikverket) which focus on developing decision support to assist the Swedish train traffic managers: the STEG project and the EOT project. Within the STEG project, researchers at Uppsala University in co-operation with Trafikverket are developing a graphical user interface (referred to as the STEG graph). Within the EOT project, researchers at Blekinge Institute of Technology (BTH) are developing fast re-scheduling algorithms to propose to the Swedish train traffic dispatchers a set of relevant re-scheduling alternatives when disturbances occur. However, neither the STEG graph nor the EOT algorithms are at this point designed to evaluate, benchmark and visualize the alternative re-scheduling solutions. The main objective of this work is therefore to identify and analyze different train traffic attributes and how to use the selected, relevant ones for benchmarking re-scheduling solutions. This involves enhancing an existing visual tool (EOT GUI) and using this extended version (referred to as the EOT GUI+) to demonstrate and evaluate the benchmarking of different re-scheduling solutions based on the selected train traffic attributes. The train traffic attributes found in the literature (primarily research publications and documents from Trafikverket) were collected and analyzed. A subset of the most commonly used attributes was then selected, and their applicability in benchmarking re-scheduling solutions for the Swedish train traffic system was further analyzed. The formulas for calculating each of the attribute values were either found in the literature and possibly modified, or defined within this thesis project. In order to assess the use of the attributes for benchmarking solutions, experiments were conducted using the enhanced visual tool EOT GUI+ and a set of sample solutions for three different disturbance scenarios provided by the EOT project. The tool only performs a benchmark of two solutions at a time (i.e. a pair-wise benchmark) and computes the attribute values for the chosen attributes. The literature review and attribute analysis resulted in a first set of ten different attributes to use, including e.g. total final delay (with a delay threshold value of 1 and 5 minutes, respectively), maximum delay, total accumulated delay, total delay cost, number of delayed trains and robustness. The formulas to compute these attribute values were implemented and applied to the sample solutions in the experiments. The first phase of the experiments showed that in one of the disturbance scenarios some of the attribute values were in conflict and that no re-scheduling solution dominated the others. This observation meant that the set of attributes needed to be narrowed down and internally prioritized. Based on the experimental results and the analysis of what the research community and the main stakeholder (i.e. Trafikverket) consider to be the most important attributes in this context, the final set of attributes to use includes average final delay, maximum delay of a single train, total number of delayed trains, and robustness. The contribution of this thesis is primarily the review and analysis of which attributes to use when performing a benchmark of re-scheduling solutions in real-time train traffic disturbance management. Furthermore, this thesis also contributes an experimental assessment of how the attributes and their formulas could work in a pair-wise, quantitative benchmark for a set of disturbance scenarios, and of which issues may occur due to conflicting objectives and attribute values. Concerning the enhancement of the visual tool and the visualization of the re-scheduling solutions, the experimental evaluation and analysis show that the tool does not directly fit the needs of the train dispatchers. This work should therefore only be seen as a starting point for the researchers who are working on the development of decision support systems in this context. Furthermore, several iterative experiments have been conducted to select the appropriate attributes for benchmarking solutions and suggesting the best re-scheduling solution. During the experiments, we used a limited set of problem instances (2+2+7) representing three different types of disturbances. The performance of the enhanced visual tool EOT GUI+ and its functionalities should ideally also be analyzed further and improved by experimenting with a larger number of instances, for other parts of the Swedish railway network, and in co-operation with the real users, i.e. the dispatchers.
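To make the attribute definitions concrete, here is a minimal sketch of how a pair-wise benchmark of two re-scheduling solutions could compute a few of the attributes named above (total and average final delay with a threshold, maximum delay, number of delayed trains). The data structure and the threshold value are illustrative assumptions, not the formulas from the thesis.

```python
from statistics import mean

def attributes(final_delays_min, threshold_min=5):
    """Compute simple benchmark attributes from each train's final delay (minutes).

    Delays below the threshold are ignored for the 'total final delay' attribute,
    mirroring the idea of a delay threshold (1 or 5 minutes) mentioned above."""
    delayed = [d for d in final_delays_min if d > threshold_min]
    return {
        "total_final_delay": sum(delayed),
        "average_final_delay": mean(final_delays_min) if final_delays_min else 0.0,
        "max_delay": max(final_delays_min, default=0),
        "num_delayed_trains": len(delayed),
    }

# Hypothetical pair-wise benchmark of two candidate solutions
solution_a = [0, 2, 7, 12, 0, 4]    # final delay per train, minutes
solution_b = [1, 3, 6, 8, 2, 5]
print(attributes(solution_a))
print(attributes(solution_b))
```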
APA, Harvard, Vancouver, ISO, and other styles
34

Warth, Benedikt. "Design and Application of Software Sensors in Batch and Fed-batch Cultivations during Recombinant Protein Expression in Escherichia coli." Thesis, Linköping University, The Department of Physics, Chemistry and Biology, 2008. http://urn.kb.se/resolve?urn=urn:nbn:se:liu:diva-12530.

Full text
Abstract:
Software sensors are a potent tool to improve real-time monitoring and control of biotechnological processes. In the current project, algorithms for six partly novel software sensors were established and tested in a microbial reactor system. Eight batch and two fed-batch runs were carried out with a recombinant Escherichia coli to investigate the suitability of the different software sensor models in diverse cultivation stages. Special attention was given to effects on the sensors after recombinant protein expression was initiated by addition of an inducer molecule. One objective was to determine the influence of excessive recombinant protein expression on the software sensor signals. Two of the developed algorithms calculated the biomass on-line and furthermore estimated the specific growth rate by integration of the biomass changes over time. The principle of the first was the application of a near-infrared probe to obtain on-line readings of the optical density. The other algorithm was founded on the titration of ammonia as the only available nitrogen source. The other two sensors analyzed the specific consumption of glucose and the specific production of acetate, and rely on an in-line HPLC system. The results showed that all software sensors worked as expected and are rather powerful for estimating important state parameters in real time. In some stages, restrictions may occur due to different limitation effects in the models or in the physiology of the culture. However, the results were very convincing and suggested the development of further and more advanced software sensor models in the future.
APA, Harvard, Vancouver, ISO, and other styles
35

Rezai, Arash. "Evaluation of development methods for mobile applications : Soundhailer’s site and iOS application." Thesis, KTH, Skolan för informations- och kommunikationsteknik (ICT), 2016. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-191124.

Full text
Abstract:
To remain competitive and successful in today's globalized market, companies need a strategy to ensure that they are constantly at the leading edge in terms of products and services. The implementation of a mobile application is one approach to fulfilling this requirement. This report gives an overview of the topic by briefly introducing today's tools for mobile application development and subsequently focusing on the Soundhailer application, which was developed by the author. The problem in focus is to find out whether a native or a web-based application is preferable as an iOS application production strategy for a start-up company. Moreover, the report gives an insight into a well-structured method that works well for setting up measuring points for a website, including Soundhailer's, and into the actual realization of a development tool for iOS development. This insight is based on considerable help from a former student of the Royal Institute of Technology, who has previous experience in the area. To show potential similarities and differences between theory and reality, these experiences are subsequently compared with the theoretical part. Finally, the results are critically discussed. Two versions of the application were developed, a native version and a web-based version, and the results show that both native and web-based applications can be convenient solutions for companies to implement and use. The results also provide a foundation upon which others can build and better understand how an iOS application is used and developed.
APA, Harvard, Vancouver, ISO, and other styles
36

Sahni, Deepak. "A Controlled Experiment on Analytical Hierarchy Process and Cumulative Voting-Investigating Time, Scalability, Accuracy, Ease of Use and Ease of Learning." Thesis, Blekinge Tekniska Högskola, Avdelningen för programvarusystem, 2007. http://urn.kb.se/resolve?urn=urn:nbn:se:bth-3224.

Full text
Abstract:
Prioritizing software requirements helps to determine which requirements are most important, and in which order requirements should be developed and tested throughout the development lifecycle. By prioritizing the requirements, software engineers can put focus on a subset of all requirements and implement these in a particular release. This thesis analyzes two ratio-scale prioritization techniques in a controlled experiment. The experiment was designed to compare time consumption, scalability, accuracy, ease of use, and ease of learning between the two techniques. All these evaluation parameters are combined to determine which technique is more suitable when prioritizing software requirements. The two techniques investigated in the study presented in this thesis are: the Analytical Hierarchy Process (AHP), which is based on pair-wise comparisons; and Cumulative Voting (CV), which is based on distributing points between requirements. The results of the experiment indicate that CV is less time consuming than AHP, which makes it more scalable. Further, CV is regarded as easier to use and easier to learn than AHP. In addition, the results show that CV is more accurate than AHP when measuring perceived accuracy. However, no conclusions could be made regarding actual accuracy due to misinterpretations by the study participants.
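For readers unfamiliar with the two techniques, here is a minimal sketch of how each produces a priority vector. The requirement names, the judgement values and the use of the row geometric-mean approximation for AHP are illustrative assumptions, not the exact procedure used in the experiment.

```python
import numpy as np

def ahp_priorities(pairwise):
    """Approximate the AHP priority vector of a pairwise-comparison matrix
    using the row geometric-mean method (a common eigenvector approximation)."""
    m = np.asarray(pairwise, dtype=float)
    gm = np.prod(m, axis=1) ** (1.0 / m.shape[1])
    return gm / gm.sum()

def cv_priorities(points):
    """Cumulative Voting: each stakeholder distributes e.g. 100 points over the
    requirements; normalising the points gives the priority vector directly."""
    p = np.asarray(points, dtype=float)
    return p / p.sum()

# Hypothetical example with three requirements R1..R3
pairwise = [[1, 3, 5],        # R1 vs R1, R2, R3 on Saaty's 1-9 scale
            [1/3, 1, 2],
            [1/5, 1/2, 1]]
print(ahp_priorities(pairwise).round(3))   # approx. [0.648 0.230 0.122]
print(cv_priorities([60, 25, 15]))         # [0.6 0.25 0.15]
```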
APA, Harvard, Vancouver, ISO, and other styles
37

Košata, Václav. "Srovnání řešení BI na bázi SaaS." Master's thesis, Vysoká škola ekonomická v Praze, 2011. http://www.nusl.cz/ntk/nusl-114272.

Full text
Abstract:
This diploma thesis focuses on a specific way of distributing Business Intelligence applications on a Software-as-a-Service basis. This different concept opens up possibilities for small and medium-sized companies that cannot afford robust and expensive solutions. The theoretical part provides an introduction to the basic characteristics of BI systems and cloud applications. Additionally, the selected criteria for comparing the specifics of applications delivered as a service are described. In the practical part of the thesis, the integration, analytical and reporting functions of Belladati, Zoho Reports and Bime are tested. The main chapter is devoted to a comparison of the products based on the selected criteria. The main contribution of the work is the identification of the strengths and weaknesses of each solution found during practical testing on the test data. The result of the comparison is not to find the best product, but to highlight their specific properties. The output can serve as background material when selecting a cloud-based BI application.
APA, Harvard, Vancouver, ISO, and other styles
38

ELFAHL, Mustafa. "Validating IRRILAB and IRRIPRO software applications to design microirrigation systems in an apple farm in Sicily." Doctoral thesis, Università degli Studi di Palermo, 2022. http://hdl.handle.net/10447/564160.

Full text
Abstract:
The hydraulic design of microirrigation systems has been widely addressed over many decades with the aim of enhancing on-farm water use efficiency and water distribution uniformity, with minimal impact on energy consumption, by introducing innovative engineering solutions for better control of the system. The thesis first reviews the literature, providing an overview of microirrigation systems, their performance and their hydraulic design procedures, along with a review of several approaches that have been developed for hydraulic design and an examination of their assumptions and theories. Among this literature are the IRRIPRO and IRRILAB software packages, which were developed within the research activities carried out at the University of Palermo (Italy). IRRILAB is based on analytical solutions and allows energy to be saved, but requires a rectangular sector shape defined by two uniform slopes, one for the laterals and one for the manifold, whereas IRRIPRO finds numerical solutions that require many attempts but can be applied to sectors of any shape. Together, these two software packages may offer a unique solution for designing microirrigation units by capturing the positive aspects of both while overcoming the negative ones. This thesis investigates the performance of the IRRILAB and IRRIPRO software applications, using both pressure-compensating and non-pressure-compensating emitters, to design microirrigation systems in an apple farm in Sicily, which is characterized by a highly irregular topography and is therefore suitable for the purpose of this study. Several investigations were carried out, through both theoretical verification and experimental validation in the field. First, a theoretical study was carried out to verify the performance of IRRILAB in designing a large number of microirrigation sectors characterized by different degrees of irregularity in the sector's slope and planform geometry, by using the design parameters of IRRILAB as input to IRRIPRO, which is able to show the pressure head distribution maps for each sector. The results of this study showed that IRRILAB can be recommended because it is easy to use and makes it possible to save energy, especially when sectors are almost rectangular and uniform in slope. In addition, a further investigation of IRRILAB was carried out by performing an experimental study on a single sector extending over 7,369 m². Without considering the minor losses caused by the presence of emitters along the laterals, the microirrigation sector for this experiment was designed as recommended by IRRILAB using a non-pressure-compensating emitter, with the aim of validating that the measured emitter flow rates fall within the corresponding IRRILAB acceptability limits. The results showed that for some emitters the flow rates fall below the minimum admitted value. This was ascribed to the irregularity of the sector planform geometry, which is not exactly rectangular as IRRILAB requires, as well as to the effect of the minor losses that were neglected in the design. Subsequently, the latter issue was investigated by using a recently introduced methodology that makes it possible to quantify the minor losses in terms of an equivalent length. Based on this calculation, a new experiment was performed, moving from the previous design that neglects minor losses to a new one in which minor losses are considered.
The results showed that a certain improvement of the emitters' flow rates was obtained, pointing to the important role that minor losses play in microirrigation system design. This study concluded that the recently introduced procedure to account for minor losses could be successfully implemented in IRRILAB to improve microirrigation system design. Finally, an experimental investigation was carried out to assess the performance of the IRRIPRO software in simulating the field situation of microirrigation systems under different operating pressures. This study is based on field measurements of emitter flow rates, aimed at validating that the emitter flow rates measured in the field match the results obtained through IRRIPRO simulations. Different statistical analyses were performed in order to show the accuracy of the IRRIPRO predictions of the emitters' flow rates. The experimental results showed how the use of IRRIPRO allows the field situation to be represented and can thereby be used to improve the performance of any microirrigation system, provided that a similar design layout is applied.
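To make the hydraulic quantities mentioned above concrete, the following lines state a few standard relations used in microirrigation design (textbook relations, not formulas quoted from the thesis): the emitter flow-pressure law, a common flow-variation acceptability criterion, and the expression of minor losses as an equivalent pipe length.

```latex
% Standard microirrigation relations (illustrative, not quoted from the thesis)
\begin{align}
  q &= k\,h^{x}
    && \text{emitter flow rate vs. pressure head; } x \approx 0.5 \text{ (non-compensating)},\ x \approx 0 \text{ (compensating)} \\
  q_{\mathrm{var}} &= \frac{q_{\mathrm{max}} - q_{\mathrm{min}}}{q_{\mathrm{max}}}
    && \text{flow-rate variation, often used as an acceptability criterion} \\
  L_{\mathrm{eff}} &= L + N\,L_{e}
    && \text{minor losses of } N \text{ emitter connections expressed as an equivalent length } L_{e}
\end{align}
```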
APA, Harvard, Vancouver, ISO, and other styles
39

Ekelund, Emil. "LactateStat: Wearable Electronics and Software for Real-Time Lactate Monitoring in Sweat." Thesis, KTH, Medicinteknik och hälsosystem, 2021. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-297538.

Full text
Abstract:
Lactate is an important biomarker in sports, and the lactate threshold concept is one of the best indicators of endurance performance in an athlete. However, to quantify the lactate threshold, an invasive method requiring a blood sample must be used. Limitations of this method include having to stop the exercise for blood sampling and the lack of real-time feedback. Instead, a novel non-invasive wearable biosensor can be used to measure the lactate concentration in sweat. The sensor generates an electrical current that depends on the lactate concentration in the sweat, and therefore an electronic device called a potentiostat must be used to condition the current. However, available potentiostats are not suitable for use in sports, where form factor, battery life and wireless connectivity are important. This thesis aims to solve this by developing a wearable device and software which can be used to measure the lactate concentration in sweat in real time during exercise. The development process consisted of the determination of specifications, prototype development, and thorough laboratory and on-body testing. Finally, a novel wearable device and software capable of real-time lactate measurements in sweat during exercise were presented. The device, called LactateStat, measures 58 mm × 55 mm × 13 mm, the current consumption is only 7.8 mA, the current measurement resolution is 0.5 nA, the limit of detection is 0.45 nA, and the current measurement range is around 750 μA. LactateStat is one of the first of its kind and provides a base for future development, as the hardware, firmware and software resources are provided as open source.
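As a rough illustration of how firmware might turn the potentiostat's current reading into a concentration, here is a sketch using a simple linear calibration. The sensitivity and blank-current values are made-up assumptions, not LactateStat's actual calibration.

```python
def lactate_from_current(current_na, sensitivity_na_per_mm=35.0, blank_na=2.0):
    """Convert an amperometric biosensor current (nA) to lactate concentration (mM)
    using a linear calibration curve: i = blank + sensitivity * c.

    Sensitivity and blank current are illustrative assumptions; a real device
    would be calibrated against standard lactate solutions."""
    return max(0.0, (current_na - blank_na) / sensitivity_na_per_mm)

# Hypothetical stream of current samples from the potentiostat (nA)
for i_na in (2.5, 40.0, 180.0, 420.0):
    print(f"{i_na:7.1f} nA -> {lactate_from_current(i_na):5.2f} mM")
```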
APA, Harvard, Vancouver, ISO, and other styles
40

Iftikhar, Muhammad Usman. "A Model-Based Approach to Engineer Self-Adaptive Systems with Guarantees." Doctoral thesis, Linnéuniversitetet, Institutionen för datavetenskap, fysik och matematik, DFM, 2017. http://urn.kb.se/resolve?urn=urn:nbn:se:lnu:diva-69136.

Full text
Abstract:
Modern software systems are increasingly characterized by uncertainties in the operating context and user requirements. These uncertainties are difficult to predict at design time. Achieving the quality goals of such systems depends on the ability of the software to deal with these uncertainties at runtime. A self-adaptive system employs a feedback loop to continuously monitor and adapt itself to achieve particular quality goals (i.e., adaptation goals) regardless of uncertainties. Current research applies formal techniques to provide guarantees for adaptation goals, typically using exhaustive verification techniques. Although these techniques offer strong guarantees for the goals, they suffer from the well-known state explosion problem. In this thesis, we take a broader perspective and focus on two types of guarantees: (1) functional correctness of the feedback loop, and (2) guaranteeing the adaptation goals in an efficient manner. To that end, we present ActivFORMS (Active FORmal Models for Self-adaptation), a formally founded model-driven approach for engineering self-adaptive systems with guarantees. ActivFORMS achieves functional correctness by direct execution of formally verified models of the feedback loop using a reusable virtual machine. To efficiently provide guarantees for the adaptation goals with a required level of confidence, ActivFORMS applies statistical model checking at runtime. ActivFORMS supports on-the-fly changes of adaptation goals and updates of the verified feedback loop models that meet the changed goals. To demonstrate the applicability and effectiveness of the approach, we applied ActivFORMS in several domains: warehouse transportation, oceanic surveillance, tele assistance, and IoT building security monitoring. (Marie Curie CIG, FP7-PEOPLE-2011-CIG, Project ID: 303791)
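To give a flavour of the feedback loop the abstract refers to, here is a minimal sketch of a monitor-analyze-plan-execute loop in Python. The structure, the latency goal and the scaling action are illustrative assumptions; they are not ActivFORMS' verified models or its virtual machine.

```python
def monitor(system):
    """Collect runtime data from the managed system (here: a fake latency probe)."""
    return {"latency_ms": system["latency_ms"]}

def analyze(knowledge, goal_ms=100.0):
    """Check the adaptation goal: latency must stay below goal_ms."""
    return knowledge["latency_ms"] > goal_ms

def plan(knowledge):
    """Choose an adaptation action (illustrative: scale out by one instance)."""
    return {"action": "scale_out", "instances": 1}

def execute(system, adaptation):
    """Apply the adaptation to the managed system."""
    system["instances"] += adaptation["instances"]
    system["latency_ms"] *= 0.7   # pretend that scaling out reduces latency

system = {"latency_ms": 150.0, "instances": 1}
for _ in range(3):                # a few iterations of the feedback loop
    knowledge = monitor(system)
    if analyze(knowledge):
        execute(system, plan(knowledge))
    print(system)
```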
APA, Harvard, Vancouver, ISO, and other styles
41

Júnior, Alberto de Medeiros. "Sistemas integrados de gestão: proposta para um procedimento de decisão multicritérios para avaliação estratégica." Universidade de São Paulo, 2007. http://www.teses.usp.br/teses/disponiveis/3/3136/tde-02062008-142434/.

Full text
Abstract:
Integrated Management Systems, also known as ERP (Enterprise Resource Planning) systems, have been widely adopted by organizations since the beginning of the 1990s. As their implementation requires a large financial investment, the team responsible for the acquisition has to take special care, since the positive or negative results only appear after a long implementation period, often after many years. As it is a complex decision problem involving uncertainties and risks, the decision makers spend a lot of time analyzing the various criteria and functionalities of the offers received. This thesis presents a procedure that allows companies, particularly small and medium-sized ones, to analyze, when they are interested in acquiring an ERP, which of the available offers is best suited to their business needs, based on a multi-criteria decision support method. The literature review analyzes computerized Information Systems (IS) and the main roles they play: operations support, competitive advantage support and decision support. In order to define the set of criteria used in the multi-criteria analysis, the Delphi method was applied with Information Technology experts. These criteria were then used to evaluate the ERP offers in a multiple case study of companies that acquired such systems, using the ANP (Analytic Network Process) as the research instrument. The results obtained from the case studies in four companies were used to validate several propositions and showed that the proposed procedure is valid and can be used by companies of all sizes.
APA, Harvard, Vancouver, ISO, and other styles
42

Stephanos, Dembe. "Machine Learning Approaches to Dribble Hand-off Action Classification with SportVU NBA Player Coordinate Data." Digital Commons @ East Tennessee State University, 2021. https://dc.etsu.edu/etd/3908.

Full text
Abstract:
Recently, strategies of National Basketball Association teams have evolved with the skillsets of players and the emergence of advanced analytics. One of the most effective actions in dynamic offensive strategies in basketball is the dribble hand-off (DHO). This thesis proposes an architecture for a classification pipeline for detecting DHOs in an accurate and automated manner. This pipeline consists of a combination of player tracking data and event labels, a rule set to identify candidate actions, manually reviewing game recordings to label the candidates, and embedding player trajectories into hexbin cell paths before passing the completed training set to the classification models. This resulting training set is examined using the information gain from extracted and engineered features and the effectiveness of various machine learning algorithms. Finally, we provide a comprehensive accuracy evaluation of the classification models to compare various machine learning algorithms and highlight their subtle differences in this problem domain.
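As a rough sketch of the kind of pipeline described above, the code below embeds a player trajectory into coarse court cells and feeds a bag-of-cells feature vector to a standard classifier. The rectangular grid (standing in for the hexbin cells), the court dimensions, the featurization and the randomly generated labelled candidates are all illustrative assumptions, not the thesis's data or model.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

def cell_path(xy, court=(94, 50), cells=(12, 6)):
    """Embed a player trajectory (sequence of court coordinates) into a sequence
    of coarse cell indices. A rectangular grid is used here for brevity; it
    stands in for the hexbin cells described in the abstract."""
    xy = np.asarray(xy, dtype=float)
    ix = np.clip((xy[:, 0] / court[0] * cells[0]).astype(int), 0, cells[0] - 1)
    iy = np.clip((xy[:, 1] / court[1] * cells[1]).astype(int), 0, cells[1] - 1)
    return ix * cells[1] + iy

def featurize(xy, n_cells=12 * 6):
    """Bag-of-cells histogram as a fixed-length feature vector for a classifier."""
    counts = np.bincount(cell_path(xy), minlength=n_cells)
    return counts / counts.sum()

# Hypothetical labelled candidates: 1 = dribble hand-off, 0 = other action
trajectories = [np.random.rand(20, 2) * [94, 50] for _ in range(40)]
labels = np.random.randint(0, 2, size=40)
X = np.vstack([featurize(t) for t in trajectories])
clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, labels)
print(clf.score(X, labels))
```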
APA, Harvard, Vancouver, ISO, and other styles
43

Ludwig, Lars [Verfasser], Hans-Görg [Akademischer Betreuer] Roos, and Gunar [Akademischer Betreuer] Matthies. "Analytical investigations and numerical experiments for singularly perturbed convection-diffusion problems with layers and singularities using a newly developed FE-software / Lars Ludwig. Gutachter: Hans-Görg Roos ; Gunar Matthies. Betreuer: Hans-Görg Roos." Dresden : Saechsische Landesbibliothek- Staats- und Universitaetsbibliothek Dresden, 2014. http://d-nb.info/1068445858/34.

Full text
APA, Harvard, Vancouver, ISO, and other styles
44

Dasgupta, Aniruddha. "CUDA performance analyzer." Thesis, Georgia Institute of Technology, 2011. http://hdl.handle.net/1853/39555.

Full text
Abstract:
GPGPU computing using CUDA is rapidly gaining ground today. GPGPU has been brought to the masses through the ease of use of CUDA and the ubiquity of graphics cards supporting it. Although CUDA has a low learning curve for programmers familiar with standard programming languages like C, extracting optimum performance from it through optimizations and hand tuning is not a trivial task. This is because, in the case of GPGPU, an optimization strategy rarely affects the functioning in an isolated manner. Many optimizations affect different aspects for better or worse, establishing a tradeoff situation between them, which needs to be carefully handled to achieve good performance. Thus optimizing an application for CUDA is tough, and the performance gain might not be commensurate with the coding effort put in. I propose to simplify the process of optimizing CUDA programs using a CUDA Performance Analyzer. The analyzer is based on analytical modeling of CUDA-compatible GPUs. The model characterizes the different aspects of the GPU compute unified architecture and can make predictions about the expected performance of a CUDA program. It also gives an insight into the performance bottlenecks of the CUDA implementation, hinting at which optimizations need to be applied to improve performance. Based on the model, one is also able to predict the performance of the application if the optimizations are applied to the CUDA implementation. This enables a CUDA programmer to test out different optimization strategies without putting in a lot of coding effort.
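As a very rough illustration of what an analytical performance model does, the sketch below bounds a kernel's execution time by the slower of its compute and memory phases. This is a generic roofline-style estimate with made-up peak numbers; it is not the model developed in the thesis.

```python
def estimate_kernel_time_ms(flops, bytes_moved,
                            peak_gflops=1000.0, peak_bw_gbs=150.0):
    """Very simplified analytical estimate of CUDA kernel execution time.

    The kernel is assumed to be limited by either arithmetic throughput or
    memory bandwidth, whichever takes longer (a roofline-style bound).
    Peak numbers are illustrative, not taken from any particular GPU model."""
    compute_ms = flops / (peak_gflops * 1e9) * 1e3
    memory_ms = bytes_moved / (peak_bw_gbs * 1e9) * 1e3
    bound = "memory" if memory_ms > compute_ms else "compute"
    return max(compute_ms, memory_ms), bound

# Hypothetical kernel: 2 GFLOP of work moving 1.2 GB of data
time_ms, bound = estimate_kernel_time_ms(2e9, 1.2e9)
print(f"predicted {time_ms:.2f} ms, {bound}-bound")
```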
APA, Harvard, Vancouver, ISO, and other styles
45

Silva, Raquel Santos. "Estudo da reta em geometria analítica: uma proposta de atividades para o ensino médio a partir de conversões de registros de representação semiótica com o uso do software GeoGebra." Pontifícia Universidade Católica de São Paulo, 2014. https://tede2.pucsp.br/handle/handle/10990.

Full text
Abstract:
This work is part of the research carried out for the Professional Master's degree in Mathematical Education at PUC-SP. Its aim is to investigate whether the use of a dynamic geometry software package, GeoGebra, can contribute, from both the cognitive and the mathematical points of view, to a better comprehension of the mathematical object "line" in analytic geometry in the third year of high school (in Brazil). To verify this, the ideas of Raymond Duval's theory of Semiotic Registers of Representation were used. Based on these ideas, a sequence of four activities was built whose main focus is the study of the line, its different forms of representation and the coordination between the algebraic, graphic and natural-language registers. For the development of the activities the students used the dynamic geometry software GeoGebra as support. The sequence was applied to students of a state public school located in the southern part of the city of São Paulo. The methodology used was the Didactic Engineering of Michèle Artigue. The results indicate that the use of the software can contribute to the apprehension of the mathematical object "line", facilitating and speeding up its study.
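As a small illustration of the kind of register conversion the sequence works with (a standard analytic-geometry example, not an excerpt from the dissertation), the same line can be written in the algebraic register in two forms and described in the natural-language and graphical registers:

```latex
% One line, several registers (illustrative example)
\begin{align}
  2x - y + 1 &= 0   && \text{algebraic register, general form} \\
  y &= 2x + 1       && \text{algebraic register, reduced (slope-intercept) form}
\end{align}
% Natural-language register: ``the line with slope 2 that crosses the y-axis at (0, 1)''.
% Graphical register: the corresponding straight line plotted, e.g., in GeoGebra.
```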
APA, Harvard, Vancouver, ISO, and other styles
46

Novotný, Roman. "Rozšíření metodiky MMSP v oblasti analýzy a návrhu testování." Master's thesis, Vysoká škola ekonomická v Praze, 2015. http://www.nusl.cz/ntk/nusl-201679.

Full text
Abstract:
The thesis "Extending the methodology MMSP in test analysis and design" characterizes the discipline of test analysis and design and incorporates it into the Methodology for Small Software Projects (MMSP). The theoretical part defines the discipline of test analysis and design and characterizes its role in the context of software testing and its use in the development of software products. It also analyzes how commonly known software development models and methodologies, certifications, standards and norms in testing, including MMSP, approach the analytical activities in the field of testing. The practical part describes the extension of the MMSP methodology with the discipline of test analysis and design, which makes it possible to use this discipline even on small projects developing non-critical IS/ICT solutions, which previously was not possible. The discipline of test analysis and design extends MMSP with a considerable amount of analytical activity in the area of testing. The extension of the methodology includes the introduction of the new role of test analyst, the modification of existing roles, the introduction of new work products, and descriptions of the recommended testing and test design techniques.
APA, Harvard, Vancouver, ISO, and other styles
47

Jakubičková, Nela. "Návrh metodiky testování BI řešení." Master's thesis, Vysoká škola ekonomická v Praze, 2011. http://www.nusl.cz/ntk/nusl-114404.

Full text
Abstract:
This thesis deals with Business Intelligence and its testing. It seeks to highlight the differences from classical software testing and, finally, to design a methodology for testing BI solutions that could be used in practice on real projects of BI companies. The aim of the thesis is to design a methodology for testing BI solutions based on theoretical knowledge of Business Intelligence and software testing, with an emphasis on the specific characteristics and requirements of BI, in accordance with Clever Decision's requirements, and to test it in practice on a real project in this company. The thesis is written on the basis of a study of Czech and foreign literature in the fields of Business Intelligence and software testing, as well as the recommendations and experience of Clever Decision's employees. It is one of the few, if not the first, sources dealing with a methodology for testing BI solutions in the Czech language. This work could also serve as a basis for more comprehensive methodologies of BI solution testing. The thesis can be divided into a theoretical and a practical part. The theoretical part explains the purpose of using Business Intelligence in enterprises. It describes the particular components of a BI solution and then software testing itself and the various types of tests, with emphasis on the differences and specificities of Business Intelligence. The theoretical part is followed by the designed methodology for testing BI solutions, using a generic model for BI/DW solution testing. The highlight of the practical part is the description of testing a real BI project at Clever Decision according to the designed methodology.
APA, Harvard, Vancouver, ISO, and other styles
48

Fridell, Gustav, and Chafjiri Saam Cedighi. "IT’S IN THE DATA : A multimethod study on how SaaS-businesses can utilize cohort analysis to improve marketing decision-making." Thesis, Linköpings universitet, Institutionen för ekonomisk och industriell utveckling, 2020. http://urn.kb.se/resolve?urn=urn:nbn:se:liu:diva-167620.

Full text
Abstract:
Incorporating data and analytics within marketing decision-making is today crucial for a company's success. This holds true especially for SaaS businesses, whose subscription-based pricing model depends on good service retention for long-term viability and profitability. Efficiently incorporating data and analytics does have its prerequisites, but for SaaS businesses it can be achieved using the analytical framework of cohort analysis, which utilizes subscription data to obtain actionable insights on customer behavior and retention patterns. Consequently, to expand the understanding of how SaaS businesses can utilize data-driven methodologies to improve their operations, this study has examined how SaaS businesses can utilize cohort analysis to improve marketing decision-making and what the prerequisites are for efficiently doing so. By utilizing a multimethodology approach consisting of action research and a single case study of the fast-growing SaaS company GetAccept, the study has concluded that the incorporation and utilization of cohort analysis can improve marketing decision-making for SaaS businesses. This conclusion is drawn from having identified that: the incorporation of cohort analysis can streamline the marketing decision-making process; and the incorporation of cohort analysis can enable decision-makers to obtain a better foundation of information on which to base marketing decisions, thus leading to an improved expected outcome of the decisions. Furthermore, to enable efficient data-driven marketing decision-making and to effectively utilize methods such as cohort analysis, the study has concluded that SaaS businesses need to fulfill three prerequisites: management that supports and advocates for data and analytics; a company culture built upon information sharing and evidence-based decision-making; and a customer base large enough to allow similarities within and differences between customer segments to be determined as significant. The last prerequisite, however, applies specifically to methods such as, or similar to, cohort analysis. Thus, by utilizing other methods, SaaS businesses might still be able to efficiently utilize data-driven marketing decision-making, as long as the first two prerequisites are fulfilled.
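To make the idea of cohort analysis concrete, here is a minimal pandas sketch that builds a monthly retention matrix from subscription data. The column names and the toy data are illustrative assumptions, not GetAccept's data model.

```python
import pandas as pd

# Hypothetical subscription log: one row per customer per active month
subs = pd.DataFrame({
    "customer_id": [1, 1, 1, 2, 2, 3, 3, 3, 3],
    "active_month": pd.to_datetime([
        "2020-01-01", "2020-02-01", "2020-03-01",
        "2020-01-01", "2020-02-01",
        "2020-02-01", "2020-03-01", "2020-04-01", "2020-05-01"]),
})

# Assign each customer to the cohort of their first active month
subs["cohort"] = subs.groupby("customer_id")["active_month"].transform("min")
subs["period"] = ((subs["active_month"].dt.year - subs["cohort"].dt.year) * 12
                  + subs["active_month"].dt.month - subs["cohort"].dt.month)

# Retention matrix: share of each cohort still active n months after signup
counts = subs.pivot_table(index="cohort", columns="period",
                          values="customer_id", aggfunc="nunique")
retention = counts.div(counts[0], axis=0)
print(retention.round(2))
```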
APA, Harvard, Vancouver, ISO, and other styles
49

BORGES, Fabrício Batista. "Descrição da secagem convectiva de grãos de milho através de modelos difusivos" [Description of the convective drying of corn grains using diffusive models]. Universidade Federal de Campina Grande, 2016. http://dspace.sti.ufcg.edu.br:8080/jspui/handle/riufcg/975.

Full text
Abstract:
The objective of this work is to carry out studies using analytical and numerical solutions of the diffusion equation to describe thin-layer drying of corn grains at temperatures of 45, 55, 65 and 75 °C; the grains have a shape that can be approximated by a parallelepiped. Numerical and analytical tools were used to study the diffusive phenomena involving the three geometric dimensions of the grains. For the proposed numerical solution, the three-dimensional diffusion equation was discretized in Cartesian coordinates using the finite volume method with a fully implicit formulation. In order to establish the correct boundary condition for describing the drying kinetics of the corn grains, three software packages were used to determine the process parameters from the experimental data via optimization. The first software used to simulate the corn drying kinetics was "Prescribed Adsorption – Desorption" V.2.2, which uses the analytical solution of the diffusion equation with a boundary condition of the first kind. The second was "Convective Adsorption – Desorption" V.2.4, which simulates the drying kinetics of products using the analytical solution of the diffusion equation with a boundary condition of the third kind. The third software used in the optimization was "LS Optimizer" V.2.1, which determines the parameters of a differential equation by the least squares method applied to the numerical solution of the diffusion equation with a boundary condition of the third kind. The latter two generated coherent and consistent results in all steps performed during the tests. It can be concluded that the second and third proposed models for analyzing the research data were consistent and equivalent, and the results obtained were satisfactory. Thus, the boundary condition of the third kind was used for the three-dimensional numerical solution of the diffusion equation in Cartesian coordinates. Validation tests indicated that the developed numerical solution gives results coherent with those expected.
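For reference, the model described above can be written compactly as the standard transient diffusion equation with a convective (third-kind) boundary condition, using conventional symbols rather than the thesis's own notation (M: moisture content, D_eff: effective diffusivity, h: convective mass-transfer coefficient, M_eq: equilibrium moisture content).

```latex
\frac{\partial M}{\partial t} = \nabla\!\cdot\!\left(D_{\mathrm{eff}}\,\nabla M\right)
\quad \text{in } 0<x<L_x,\ 0<y<L_y,\ 0<z<L_z,
\qquad
-D_{\mathrm{eff}}\,\frac{\partial M}{\partial n}\bigg|_{\text{surface}}
= h\left(M\big|_{\text{surface}} - M_{\mathrm{eq}}\right).
```

The first-kind condition used by the first software package is the limiting case h → ∞, in which the surface moisture is fixed at M_eq.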
APA, Harvard, Vancouver, ISO, and other styles
50

Заковоротный, Александр Юрьевич. "Синтез автоматизированной системы управления подвижным составом на основе геометрической теории управления и нейронных сетей" [Synthesis of an automated rolling stock control system based on geometric control theory and neural networks]. Thesis, НТУ "ХПИ", 2017. http://repository.kpi.kharkov.ua/handle/KhPI-Press/28330.

Full text
Abstract:
Dissertation for the degree of Doctor of Technical Sciences, specialty 05.13.07 – Automation of Control Processes. – National Technical University "Kharkiv Polytechnic Institute", Kharkiv, 2017. The dissertation addresses the scientific and applied problem of developing an onboard decision support system for the train driver, built on generalized mathematical models and tools for optimizing the dynamics of moving objects, new methods and specialized software, and a new information-processing technology based on stable-plastic neural networks and new models of associative memory; this creates the theoretical preconditions for automatic control systems for rolling stock and improves its energy characteristics. A diesel-train model was developed that takes into account the principal types of car oscillations, the distribution of interaction forces between cars during motion, and the parallel operation of the traction drives, and that adequately reflects the processes occurring on the real object. Specialized software was developed that implements a human-machine system automating the analytical transformations of geometric control theory when synthesizing models in Brunovsky form for objects described by systems of high-order ordinary differential equations with several controls. Based on adaptive resonance theory neural networks capable of solving problems with multiple solutions, a new method was proposed for finding the transformation functions between the variables of the linear (Brunovsky-form) model and the variables of the nonlinear model of the control object. Using the maximum principle, two optimal control problems for traction rolling stock were solved: minimum-time control and minimization of a weighted linear combination of time and the quadratic control cost. This makes it possible to obtain, for each section of the route, control laws that define the minimum time needed to cover the section, as well as control laws that keep the timetable while minimizing the consumption of fuel and energy resources. Stable-plastic Hamming, Hebb and perceptron-based neural networks were developed that can recognize new information and continue learning during operation, providing an alternative to discrete adaptive resonance theory networks. An onboard decision support system was developed that, under real high-speed operating conditions, provides the driver with a train control law that keeps the timetable at minimum fuel and energy cost. To implement the database of the driver decision support system, an N-directional neural-network associative memory was developed that can reconstruct, from an input vector, a set of N vectors associated with the input information, together with a bidirectional multilayer discrete associative memory with control neurons that can reconstruct chains of associations and correct results using additional information. A knowledge base was created that can store several equivalent decisions about the train control law for the current section, along with a neural-network diagnostic system for traction motors and a system for predicting and suppressing wheel-set slipping during motion. Experimental studies of the intelligent driver decision support system and of the optimal rolling stock control laws, carried out on mathematical models and on the real object, confirmed the validity of the proposed solutions for automating diesel-train motion control.
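As a small illustration of the recall principle behind the Hamming networks mentioned in the abstract, the sketch below implements the classical discrete Hamming network (a similarity layer followed by winner-take-all selection). It is only the textbook scheme with hypothetical bipolar patterns, not the stable-plastic modification developed in the dissertation.

```python
# Classical discrete Hamming-network recall: score a bipolar input against
# stored prototype patterns and select the closest one (winner-take-all).
import numpy as np

def hamming_recall(prototypes: np.ndarray, x: np.ndarray) -> int:
    """prototypes: (k, n) matrix of bipolar (+1/-1) reference patterns;
    x: bipolar input vector of length n. Returns the index of the best match."""
    n = prototypes.shape[1]
    # Similarity layer: number of components in which x matches each prototype.
    similarity = (prototypes @ x + n) / 2.0
    # MAXNET competition layer, reduced here to its fixed point: the argmax.
    return int(np.argmax(similarity))

stored = np.array([[ 1,  1,  1, -1, -1, -1],
                   [-1, -1,  1,  1, -1,  1]])
noisy = np.array([1, 1, -1, -1, -1, -1])   # prototype 0 with one bit flipped
print(hamming_recall(stored, noisy))       # -> 0
```

The stored prototypes stand in for reference patterns (for example, known fault signatures); the network returns the index of the prototype closest in Hamming distance to the noisy input.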
APA, Harvard, Vancouver, ISO, and other styles
