
Dissertations / Theses on the topic 'Functions of information management'


Consult the top 50 dissertations / theses for your research on the topic 'Functions of information management.'


You can also download the full text of each academic publication as a PDF and read its abstract online whenever it is available in the metadata.

Browse dissertations / theses in a wide variety of disciplines and organise your bibliography correctly.

1

Pensulo, Emilius M. "Integrating computer aided engineering functions: the management of information." Thesis, Aston University, 1987. http://publications.aston.ac.uk/11853/.

Full text
APA, Harvard, Vancouver, ISO, and other styles
2

Kan, Xuan, and Junchao Lu. "The Functions of Information System in The Management of Corporate Social Responsibility." Thesis, Internationella Handelshögskolan, Högskolan i Jönköping, IHH, Informatik, 2012. http://urn.kb.se/resolve?urn=urn:nbn:se:hj:diva-18991.

Full text
Abstract:
Background: Most organizations rely on their information systems on a daily basis to generate opportunities and advantages. Meanwhile, with the increasing emphasis on corporate social responsibility (CSR), more directors and CEOs are shifting their attention to CSR by developing business models that underline responsible and ethical disciplines for running businesses. However, studies that combine the analysis of IS and CSR are few. The potential benefits of utilizing IS from a more intangible perspective, in this case CSR, are therefore less likely to be discovered. In addition, senior managers have a hard time shifting perceptions of CSR from an add-on activity to something integrated into the core operations of the organization. Aim & Purpose: The aim of our research is to investigate the functions that IS has in CSR management from managers' perspectives. As mentioned above, the main reason for conducting this research is the existing knowledge gap between the subjects being investigated. The overall purpose of this study is to adopt the EFQM Excellence Model and the Work System Model to contribute to CSR value creation in the organization. Method: The research approach is qualitative, and a case study is applied as the research strategy. Data collection is carried out by means of documents and interviews. Documents include company annual reports and website information. Semi-structured interviews are conducted with managers from Länsförsäkringar Jönköping and PwC. Conclusions: The functions of IS in the management of CSR can be grouped into four aspects: information management, customer relationship management, monitoring of daily affairs, and corporate governance. Information systems have changed the way data is sorted and disseminated and have accelerated the frequency of information exchange in business operations. Those changes in turn are reflected in the customer, employee, and corporate governance performance of the organization.
APA, Harvard, Vancouver, ISO, and other styles
3

Gajic, Borislava [Verfasser]. "Control and management plane helper functions for information-centric networks / Borislava Gajic." Aachen : Hochschulbibliothek der Rheinisch-Westfälischen Technischen Hochschule Aachen, 2013. http://d-nb.info/1046651692/34.

Full text
APA, Harvard, Vancouver, ISO, and other styles
4

Rackliffe, John A. "Evaluation of the Shipyard Management Information System (Material Management functions) at the Long Beach Naval Shipyard." Thesis, Monterey, California. Naval Postgraduate School, 1989. http://hdl.handle.net/10945/27038.

Full text
APA, Harvard, Vancouver, ISO, and other styles
5

Wen, Shenning. "The study, design, and implementation of Data mart functions in Windows environments." CSUSB ScholarWorks, 1998. https://scholarworks.lib.csusb.edu/etd-project/1374.

Full text
APA, Harvard, Vancouver, ISO, and other styles
6

Chao, Sam. "The design and implementation of object management functions for web-based repository." Thesis, University of Macau, 1999. http://umaclib3.umac.mo/record=b1636967.

Full text
APA, Harvard, Vancouver, ISO, and other styles
7

Ho, Estela Maria. "The roles and functions of information, communication and entertainment (ICE) in the service quality of an airport." Thesis, University of Macau, 2005. http://umaclib3.umac.mo/record=b1636648.

Full text
APA, Harvard, Vancouver, ISO, and other styles
8

Franco, Muriel Figueredo. "Interactive visualizations for management of NFV-enabled networks." reponame:Biblioteca Digital de Teses e Dissertações da UFRGS, 2017. http://hdl.handle.net/10183/158202.

Full text
Abstract:
Network Functions Virtualization (NFV) is driving a paradigm shift in telecommunications and computer networks by fostering new business models and creating innovation opportunities. In NFV-enabled networks, service providers have the opportunity to build a business model where tenants can purchase Virtual Network Functions (VNFs) that provide distinct network services and functions (e.g., firewalls, NAT, and transcoders). However, the amount of managed data grows at a fast pace, and the network operator must understand and manipulate a great deal of data to manage the network effectively. To tackle this problem, we introduce VISION, a platform based on visualization techniques that helps network operators determine the cause of non-obvious problems. For this, we provide: (i) an approach to collect and organize data from NFV environments; (ii) five distinct visualizations that can aid in NFV management tasks, such as identifying VNF problems and planning NFV-enabled businesses; and (iii) a template model that supports new visualization applications. To evaluate our work, we implemented a prototype of the VISION platform and each of the proposed visualizations. We then conducted distinct case studies to provide evidence of the feasibility of our visualizations. These case studies cover different scenarios, such as identifying misplaced VNFs that generate bottlenecks in a forwarding graph and investigating investment priorities to supply tenants' demands. Finally, we present a usability evaluation with network operators that indicates the benefits of the VISION platform. The results show that our visualizations allow the operator to access relevant information and gain insights to identify non-obvious problems in the context of NFV-enabled networks. In addition, we received positive feedback about the general usability aspects of our prototype.
APA, Harvard, Vancouver, ISO, and other styles
9

Choudhury, Randip. "How Marknadsdata information AB works with relationship marketing : analysis of relationship marketing in Marknadsdata Information AB." Thesis, Högskolan i Gävle, Avdelningen för ekonomi, 2011. http://urn.kb.se/resolve?urn=urn:nbn:se:hig:diva-9816.

Full text
Abstract:
Title: How Marknadsdata information AB works with relationship marketing. Level: Final thesis for a bachelor degree in Business Administration. Author: Randip Choudhury. Supervisor: Dr. Aihie Osarenkhoe. Purpose: The purpose of this thesis is to examine how Marknadsdata information AB works with its relationship marketing, and to examine possible gaps between its strategy and the implementation of its relationship marketing. The study aims to provide insights into the core components of CRM and the implementation of a CRM strategy. Methodology: I have used primary data for this study, conducting interviews with Christer Jönsson at Marknadsdata AB to get his inside view of how the company operates, in order to answer the purpose of this thesis: to find out whether there are any gaps in the way Marknadsdata AB operates its business and how it resembles or differs from Donaldson's model. The secondary data used for this research comes mainly from the Internet, articles, and literature. Findings: Marknadsdata information AB will not be successful in its market unless it works with its relationship marketing strategy. It has to keep in touch with customers and respond to trends in the market and, in particular, to the changing demands of customers. For Marknadsdata information AB, relationship marketing is important both internally and externally. Internal service quality for the company is about employee satisfaction, which ultimately affects external customer satisfaction. By creating value for employees, the company can create value for customers. Research conclusions: Companies will not be successful in the market until they develop their relationship marketing strategy. Companies need to keep in touch with customers and respond to trends in the market and, in particular, to the changing demands of customers. Practical implications: Working with research and evaluating customers' needs will, I found, be reflected in the company's future business. I also think it is important that the company's management works with employees on a regular basis. If the company does not work with market research and other customer-generating mechanisms, I think it will have difficulties with its relationship work and with doing business in its market area. Value: The study presents customer relationship marketing strategy as a key to operating a business and implementing it in the organization. This thesis contributes a deeper understanding of the subject of relationship marketing. Keywords: Customer relations, Customer service management, Relationship marketing, Cross-functional integration, Management strategy
APA, Harvard, Vancouver, ISO, and other styles
10

Broadbent, Robert Emer. "A Functional Framework for Content Management." BYU ScholarsArchive, 2009. https://scholarsarchive.byu.edu/etd/1737.

Full text
Abstract:
This thesis proposes a functional framework for content management. This framework provides concepts and vocabulary for analysis and description of content management systems. The framework is derived from an analysis of eight content management systems. It describes forty-five conceptual functions organized into five functional groups. The functionality derived from the analysis of the content management systems is described using the vocabulary provided by the functional framework. Coverage of the concepts in the existing systems is verified. The utility of the framework is validated through the creation of a prototype that implements sufficient functionality to support a set of specific use cases.
APA, Harvard, Vancouver, ISO, and other styles
11

Nguema, Chancelia Gray Angounou. "The role of audit functions in enterprise resource planning projects in a selected organisation in South Africa." Thesis, Cape Peninsula University of Technology, 2018. http://hdl.handle.net/20.500.11838/2863.

Full text
Abstract:
Thesis (MTech (Business Information Systems))--Cape Peninsula University of Technology, 2018.
Enterprise resource planning (ERP) systems integrate business processes (BPs) into one database, facilitate data sharing, and provide real time information to authorised users, leading to an increase in efficiency and effectiveness. However, the implementation of an ERP system is not always a success as some systems turn out to be misaligned with the organisation’s objectives. This misalignment can lead to inadequate controls within the system. ERP systems are designed to improve transactions within the BPs and provide a competitive advantage to organisations. However, this benefit can become a weakness if project implementation fails due to controls in the system not being aligned with the objectives. The aim of the study is to explore how audit functions can contribute to the implementation of ERP projects, and the objective is to propose a guideline that can improve the implementation processes of ERP projects. To address the aim and meet the objective of this study, two main questions are asked: 1) What are the factors to be considered when introducing audit functionality in the implementation of an ERP system? 2) How can audit functions assist organisations in ERP project implementation? A subjectivist philosophical stance is followed and the epistemology lies within the interpretivist paradigm. An inductive research approach is followed and a case study is used as research strategy to conduct the research. The unit of analysis is the Operation Finance and Information Technology departments within the selected organisation, while selected employees (14) within the organisation form the unit of observation. A non-random, purposively selected sampling technique was used. Data were collected by means of semi-structured questionnaires through interviews. Data were analysed by summarising, categorising, and applying thematic analysis. 
The data analysis shows that the audit functions (the Operation Finance department and internal and external auditors) bring objectivity and assurance to the project in terms of financial reports, checks and balances, processes, structure, and internal controls. Getting people to cooperate, however, is a challenge for the audit functions, and internal and external auditors can be a challenge during project implementation because their practical skills and computer-based knowledge for dealing with huge volumes of data are extremely limited. It is highly recommended that the guideline presented in this research be followed, that engagement of the audit functions with business processes be introduced and adopted by the other role players involved in the project implementation process, and that the audit functions be seen not as a 'must have' but rather as support to improve the process. Ethical requirements as requested by CPUT are fulfilled.
APA, Harvard, Vancouver, ISO, and other styles
12

Abd, Hadi Zakaria. "Tradable information function in government organisations : a cross cultural study." Thesis, De Montfort University, 2002. http://hdl.handle.net/2086/4321.

Full text
APA, Harvard, Vancouver, ISO, and other styles
13

Myers, Barry L. "Information systems assessment: development of a comprehensive framework and contingency theory to assess the effectiveness of the information systems function." Thesis, University of North Texas, 2003. https://digital.library.unt.edu/ark:/67531/metadc4302/.

Full text
Abstract:
The purpose of this research is to develop a comprehensive IS assessment framework, using existing IS assessment theory as a base and incorporating suggestions from other disciplines. To validate the framework and to begin the investigation of current IS assessment practice, a survey instrument was developed. A small group of subject matter experts evaluated and improved the instrument, which was then further evaluated using a small sample of IS representatives. Results of this research include a reexamination of the IS function measurement problem using new frameworks of analysis, yielding (a) guidance for the IS manager or executive on which IS measures might best fit their organization, (b) further verification of the important measures most widely used by IS executives, (c) a comprehensive, theoretically derived IS assessment framework, and (d) the enhancement of IS assessment theory by incorporating ideas from actual practice. The body of knowledge gains a comprehensive IS assessment framework that can be further tested for usefulness and applicability. Future research is recommended to substantiate and improve on these findings. Chapter 2 is a complete survey of prior research, subdivided by relevant literature divisions, such as organizational effectiveness, quality management, and IS assessment. Chapter 3 includes development of and support for the research questions, the IS assessment framework, and the research model. Chapter 4 describes how the research was conducted. It includes a brief justification for the research approach, a description of how the framework was evaluated, a description of how the survey instrument was developed and evaluated, a description of the participants and how they were selected, a synopsis of the data collection procedures, a brief description of follow-up procedures, and a summary. Chapter 5 presents the results of the research. Chapter 6 is a summary and conclusion of the research.
Finally, included in the appendices are definitions of terms, and copies of the original and improved survey instruments.
APA, Harvard, Vancouver, ISO, and other styles
14

Mills, John Barry. "The role of information technology in cross-functional integration." Thesis, University of the West of England, Bristol, 1995. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.283597.

Full text
APA, Harvard, Vancouver, ISO, and other styles
15

Marklund, Mikael. "Cross-functional product information process in a de-centralized organization." Thesis, Blekinge Tekniska Högskola, Sektionen för management, 2010. http://urn.kb.se/resolve?urn=urn:nbn:se:bth-1150.

Full text
Abstract:
Changes in big companies resulting in new organizational structures and cost cutting are pushing more and more of the knowledge and information handling to sub-units in a multi-national structure. For big knowledge-intensive companies that act in the global marketplace, internal information handling is becoming a challenge. The study and reflections are based on experiences from Ericsson, a knowledge-intensive global telecommunication company. This company delivers complex cross-functional products (solutions) and has a decentralized organization. It faces the cost of managing distributed product information and the challenge of gathering relevant information in the sales departments. The company's complex and unique product offerings can easily be characterized as having multiple dependencies. The solutions are composed of building blocks, i.e. different sub-products, delivered by different product units. The different sub-products suffer from limitations in how they can be combined into solutions. This study addresses the information gaps in a decentralized organization regarding this specific issue. It focuses on identifying vital information without driving cost or requiring organizational changes. Stakeholder identification was done from a value chain perspective. The type of information that would give the most profitable solutions was identified during group sessions and individual interviews. An asymmetric compatibility matrix (ACM) was developed to fit the purpose of keeping maintenance cost low without requiring organizational changes. The ACM was applied, and process maturity improvements were evaluated using the Process and Enterprise Maturity Model. The Ericsson-specific study shows that the use of an ACM for product compatibility information makes it possible to define information responsibility that is sustainable over time. Thereby the maintenance cost for this information can be brought down to a minimum.
Furthermore, the study shows that the effort of gathering information for the sales organizations to provide customer solutions can be reduced by the use of an ACM offering generic compatibility information. Users of the ACM would be able to re-use its information and focus on customer specific sales and deployment issues rather than re-do what others already have done. Cost of sales as well as business risks would thereby likely decrease, affecting the bottom line positively. Furthermore new business opportunities are assumed to be addressed better since relevant generic information is made available up front. Other positive expected benefits are prevention of network malfunctions and increased customer satisfaction. From this study it can be concluded that an ACM can be a powerful tool for gathering cross-functional product information in large decentralized organizations at a low cost, without any organizational changes, and with a high process maturity. Further research would be needed if one would consider validating the general applicability of the ACM in other processes.
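The asymmetric compatibility matrix described in the abstract above can be sketched in a few lines. This is an illustrative toy only: the product names, matrix entries, and validity rule are hypothetical, not Ericsson's actual data or tooling. The key idea is that compatibility is stored per ordered pair, so the answer for A combined with B may differ from B combined with A.

```python
# Toy asymmetric compatibility matrix (ACM). Entries are directional:
# acm[a][b] answers "can sub-product a be combined with b?", and the
# reverse direction may hold a different answer. All names and values
# below are hypothetical illustrations, not real product data.

acm = {
    "billing-v2":  {"core-net-v5": True,  "core-net-v6": True},
    "billing-v1":  {"core-net-v5": True,  "core-net-v6": False},
    "core-net-v5": {"billing-v2": True,   "billing-v1": True},
    "core-net-v6": {"billing-v2": True,   "billing-v1": True},
}

def compatible(a: str, b: str) -> bool:
    """Directional lookup; unknown pairs default to incompatible."""
    return acm.get(a, {}).get(b, False)

def solution_is_valid(products: list[str]) -> bool:
    """A candidate solution is valid only if every ordered pair of its
    sub-products is compatible in the matrix."""
    return all(compatible(a, b) for a in products for b in products if a != b)

# Asymmetry in action: the newer core network accepts the old billing
# product, but the old billing product cannot run against it.
print(compatible("core-net-v6", "billing-v1"))  # True
print(compatible("billing-v1", "core-net-v6"))  # False
```

Because each direction is owned by the team responsible for the row product, responsibility for keeping the information current can be assigned without any organizational change, which matches the low-maintenance motivation given in the abstract.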
APA, Harvard, Vancouver, ISO, and other styles
16

Potter, Scott Steven. "The development of temporal and functional information displays to support cooperative fault management /." The Ohio State University, 1994. http://rave.ohiolink.edu/etdc/view?acc_num=osu1487849377295681.

Full text
APA, Harvard, Vancouver, ISO, and other styles
17

Buckl, Sabine M. [Verfasser], Pontus [Akademischer Betreuer] Johnson, and Florian [Akademischer Betreuer] Matthes. "Developing Organization-Specific Enterprise Architecture Management Functions Using a Method Base / Sabine Buckl. Gutachter: Pontus Johnson. Betreuer: Florian Matthes." München : Universitätsbibliothek der TU München, 2011. http://d-nb.info/1014412560/34.

Full text
APA, Harvard, Vancouver, ISO, and other styles
18

Leung, Tsui-shan. "A functional analysis of GIS for slope management in Hong Kong /." Hong Kong : University of Hong Kong, 2000. http://sunzi.lib.hku.hk/hkuto/record.jsp?B22032447.

Full text
APA, Harvard, Vancouver, ISO, and other styles
19

LIN, HUAYI. "Balancing Stakeholder Interests for Sustainable Wolf Population Management in Sweden." Thesis, Uppsala universitet, Institutionen för geovetenskaper, 2013. http://urn.kb.se/resolve?urn=urn:nbn:se:uu:diva-193267.

Full text
Abstract:
In this paper, Swedish wolf population management was analyzed with two models. The first is an information index system that measures the positive and normative information held by the stakeholders. Using this system, quantitative information indices can be computed to give stakeholders a clear understanding of their own current situation and that of other stakeholders, in order to enhance communication and engagement. Testing for current information shortage with a national survey confirmed that an information shortage does exist in society and that the need for improving information acquisition is significant. In addition, satisfaction functions of positive and negative stakeholders towards the wolf population were used to identify an agreeable wolf population. Stakeholder satisfaction was expressed as a function of wolf population, either positively or negatively correlated, using economic and social features such as taxes, compensation, preventive payments, lupophobia, biophilia, etc. Weights were assigned to derive overall goal functions for pro- and anti-wolf stakeholders, in order to find whether there exist wolf population levels that might indicate a common preference. Whereas the current wolf population in Sweden is around 210, the results showed that all stakeholder groups could be satisfied with a population of around 500 wolves. Some major policy measures were studied for their influence on stakeholders' interests, in particular how to increase the wolf population in order to achieve this solution.
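The weighted goal-function idea in the abstract above can be illustrated with a minimal sketch. The curve shapes, parameter values, weights, and threshold below are hypothetical stand-ins, not the thesis's actual functions or data: pro-wolf satisfaction rises with the wolf population, anti-wolf satisfaction falls, and a "common preference" region is where both weighted goal functions clear a threshold.

```python
import math

def pro_satisfaction(pop, midpoint=400, steepness=0.01):
    """Pro-wolf satisfaction: rises with population (logistic curve)."""
    return 1.0 / (1.0 + math.exp(-steepness * (pop - midpoint)))

def anti_satisfaction(pop, midpoint=600, steepness=0.01):
    """Anti-wolf satisfaction: falls with population (mirrored logistic)."""
    return 1.0 / (1.0 + math.exp(steepness * (pop - midpoint)))

def acceptable_populations(weights_pro, weights_anti, threshold=0.4):
    """Population levels where both groups' weighted goal functions
    clear the threshold -- a crude 'common preference' region."""
    acceptable = []
    for pop in range(0, 1001, 10):
        goal_pro = sum(w * pro_satisfaction(pop) for w in weights_pro)
        goal_anti = sum(w * anti_satisfaction(pop) for w in weights_anti)
        if goal_pro >= threshold and goal_anti >= threshold:
            acceptable.append(pop)
    return acceptable

region = acceptable_populations(weights_pro=[1.0], weights_anti=[1.0])
print(region[0], region[-1])  # lower and upper bound of the region
```

With these toy curves the mutually acceptable region spans a few hundred wolves centered between the two midpoints, loosely mirroring the abstract's finding that a level around 500 could satisfy all groups.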
APA, Harvard, Vancouver, ISO, and other styles
20

Peak, Daniel Alan. "The Risks and Effects of Outsourcing on the Information Systems Function and the Firm." Thesis, University of North Texas, 1994. https://digital.library.unt.edu/ark:/67531/metadc279257/.

Full text
Abstract:
IS outsourcing, especially large-scale IS outsourcing, is a comparatively recent and rapidly growing IS phenomenon, but it is also an inherently risky activity. In an IS outsourcing arrangement, the outsourcing vendor accepts responsibility for IS resources and functions formerly controlled directly by the firm. This research examines IS outsourcing from two perspectives. (1) From an IS perspective, it examines the risk perceptions of IS managers of fourteen Fortune-500 firms who had recently conducted an outsourcing evaluation. (2) From a financial perspective, it examines the theoretical relationship of IS outsourcing with financial performance, and investigates the empirical effects of IS outsourcing on the firm's market value and market risk. This research views IS outsourcing as an independent variable whose effects on the firm may be measured as changes in security returns, changes in asset risk, changes in capital structure, and long-term changes in profitability. To accomplish this, it characterizes IS outsourcing as a sale-and-leaseback transaction.
APA, Harvard, Vancouver, ISO, and other styles
21

Ewerstein, Anders, and Markus Jansson. "Management method for Change Management in ERP systems." Thesis, KTH, Skolan för informations- och kommunikationsteknik (ICT), 2014. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-177379.

Full text
Abstract:
The objective of this thesis is to help Spotify improve its internal change management process in its financial system. The work was done by charting the internal teams that create, influence, and use data in the company's financial system. Furthermore, the working methods of the different teams were compared and differences in how they work were identified. Our findings show that there are different working processes and attitudes across the teams, which creates challenges in the change management process, especially in cross-functional projects. Conclusions from the results are presented as suggestions that may help to improve the change management process in cross-functional projects at Spotify: Implement a coordinator role that works as a single point of contact for everything related to changes in the financial system; the new role helps to improve the distribution of information. Create a new team that is responsible for all changes that affect the financial system; a new team reduces dependencies between the different teams. Introduce cross-functional project managers who receive dedicated resources to implement projects where multiple teams are involved; the project manager can then take full responsibility for the entire process. Make sure the effectiveness and productivity of all involved teams is measured by the value created for the whole value chain rather than when their respective part has been delivered. Hold a workshop (1-5 days) in which participants from the teams that need to integrate components work together; this is an effective way to minimize the waiting time between teams. Create a service level agreement between the different teams, so that each team can efficiently plan its resources and know what to expect from other teams.
APA, Harvard, Vancouver, ISO, and other styles
22

Paula, Danúzia da Rocha de. "Gestão da informação na Fiocruz: um modelo de análise." reponame:Repositório Institucional da UFF, 2011. https://appdesenv.uff.br/riuff/handle/1/354.

Full text
Abstract:
Universidade Federal Fluminense
This paper aims to develop an analytical model of Information Management at the Oswaldo Cruz Foundation (Fiocruz), a government institution that produces a large volume of information in circulation. The instrument, developed on the basis of three models targeted at government agencies, was tested and enabled us to identify the level of maturity of Information Management. For this work, a survey was carried out with the managers and specialists of the institution. The study was performed in one technical-administrative unit and in four technical-scientific units. According to the study, Information Management is at the strong level. However, it was found that some activities, although implemented, are not very well developed or are still poorly disseminated within the institution. It is proposed that Fiocruz expand the instrument presented and apply it in all units of the institution.
APA, Harvard, Vancouver, ISO, and other styles
23

Sui, Liqi. "Uncertainty management in parameter identification." Thesis, Compiègne, 2017. http://www.theses.fr/2017COMP2330/document.

Full text
Abstract:
Afin d'obtenir des simulations plus prédictives et plus précises du comportement mécanique des structures, des modèles matériau de plus en plus complexes ont été développés. Aujourd'hui, la caractérisation des propriétés des matériaux est donc un objectif prioritaire. Elle exige des méthodes et des tests d'identification dédiés dans des conditions les plus proches possible des cas de service. Cette thèse vise à développer une méthodologie d'identification efficace pour trouver les paramètres des propriétés matériau, en tenant compte de toutes les informations disponibles. L'information utilisée pour l'identification est à la fois théorique, expérimentale et empirique : l'information théorique est liée aux modèles mécaniques dont l'incertitude est épistémique; l'information expérimentale provient ici de la mesure de champs cinématiques obtenues pendant l'essai et dont l'incertitude est aléatoire; l'information empirique est liée à l'information à priori associée à une incertitude épistémique ainsi. La difficulté principale est que l'information disponible n'est pas toujours fiable et que les incertitudes correspondantes sont hétérogènes. Cette difficulté est surmontée par l'utilisation de la théorie des fonctions de croyance. En offrant un cadre général pour représenter et quantifier les incertitudes hétérogènes, la performance de l'identification est améliorée. Une stratégie basée sur la théorie des fonctions de croyance est proposée pour identifier les propriétés élastiques macro et micro des matériaux multi-structures. Dans cette stratégie, les incertitudes liées aux modèles et aux mesures sont analysées et quantifiées. Cette stratégie est ensuite étendue pour prendre en compte l'information à priori et quantifier l'incertitude associée
In order to obtain more predictive and accurate simulations of mechanical behaviour in the practical environment, more and more complex material models have been developed. Nowadays, the characterization of material properties remains a top-priority objective. It requires dedicated identification methods and tests in conditions as close as possible to the real ones. This thesis aims at developing an effective identification methodology to find the material property parameters, taking advantage of all available information. The information used for the identification is theoretical, experimental, and empirical: the theoretical information is linked to the mechanical models, whose uncertainty is epistemic; the experimental information consists of the full-field measurement, whose uncertainty is aleatory; the empirical information is related to the prior information, with epistemic uncertainty as well. The main difficulty is that the available information is not always reliable and its corresponding uncertainty is heterogeneous. This difficulty is overcome by the introduction of the theory of belief functions. By offering a general framework to represent and quantify the heterogeneous uncertainties, the performance of the identification is improved. A strategy based on belief functions is proposed to identify macro and micro elastic properties of multi-structure materials. In this strategy, model and measurement uncertainties are analysed and quantified. This strategy is subsequently developed to take prior information into consideration and quantify its corresponding uncertainty.
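The belief-function fusion described in this abstract can be illustrated with Dempster's rule of combination. The sketch below is a minimal, self-contained example; the frame of discernment, the two evidence sources ("model" and "measurement") and all mass values are hypothetical and not taken from the thesis.

```python
from itertools import product

def combine_dempster(m1, m2):
    """Combine two mass functions (dicts mapping frozenset -> mass)
    using Dempster's rule with conflict normalization."""
    combined = {}
    conflict = 0.0
    for (a, wa), (b, wb) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:  # compatible focal elements reinforce each other
            combined[inter] = combined.get(inter, 0.0) + wa * wb
        else:      # disjoint focal elements contribute to conflict
            conflict += wa * wb
    if conflict >= 1.0:
        raise ValueError("Sources are totally conflicting")
    return {s: w / (1.0 - conflict) for s, w in combined.items()}

# Two hypothetical evidence sources about whether a material
# parameter lies in a "low" (L) or "high" (H) range.
L, H = frozenset(["L"]), frozenset(["H"])
LH = L | H  # mass on the whole frame expresses ignorance

model_evidence = {L: 0.6, LH: 0.4}               # epistemic source
measurement_evidence = {L: 0.5, H: 0.3, LH: 0.2}  # aleatory source

fused = combine_dempster(model_evidence, measurement_evidence)
```

After fusion the mass on L is reinforced by both sources, while the residual mass on the full frame shrinks, which is the qualitative behaviour the identification strategy relies on.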
APA, Harvard, Vancouver, ISO, and other styles
24

Aluebhosele, Dandy, and George Anobah. "CHIEF INFORMATION OFFICERS EVOLVING ROLES AND RESPONSIBILITIES "From Operational to Strategic"." Thesis, Mälardalen University, Mälardalen University, Mälardalen University, 2009. http://urn.kb.se/resolve?urn=urn:nbn:se:mdh:diva-5578.

Full text
Abstract:

The Chief Information Officer (CIO) position is seen as very important to every organization, including organizations that have either outsourced or insourced their IT function. Various studies have shown that this role has emerged as a critical executive position in most organizations and helps to shape organizational strategy. The CIO has a major responsibility for aligning IT with business strategy, which leads the organization to achieve a higher competitive advantage. This work describes the various roles of the CIO in organizations, with a special focus on IT-business strategy alignment.

Based on our investigation of previous research, case studies and current interviews with CIOs, we were able to see that the CIO's role is shifting from an operational to a more strategic one. The CIO is seen as the bridge between IT strategy and business strategy. As a result, CIOs collaborate closely with CEOs in order to succeed in aligning IT strategy with business objectives. In this view, the CIO plays the role of both the chief architect, who designs future possibilities for the business, and the technology provocateur (intelligent officer), who aligns IT with the business.

APA, Harvard, Vancouver, ISO, and other styles
25

Ball, Richard. "Operational effectiveness of the information technology function in business process change: A case study in a financial services firm." Master's thesis, University of Cape Town, 2016. http://hdl.handle.net/11427/22757.

Full text
Abstract:
In order to address the need to remain flexible in dynamic business environments, organisations must focus on the effectiveness of their core operational processes. Operational effectiveness has been claimed to have a direct influence on business performance. In order to improve their effectiveness, many organisations invest in information technology (IT) systems, even though the extent to which these technological initiatives influence operational effectiveness is considered to be largely misunderstood by the organisations that employ them. In this dissertation, the relationship between the Operations and IT departments of a financial services firm is investigated. This study pays particular attention to the factors that have the potential to influence the ability of the organisation to align its strategies. The enquiry takes the form of two distinct research questions: 1) What factors in the organisation have an impact on the success of business process change proposals? 2) How is the role of IT perceived in the preparation of business process change initiatives? The study involved conducting semi-structured interviews with members of both departments. A qualitative inductive approach was used to analyse the data collected from these interviews in order to identify themes. The emergent phenomena were then considered in conjunction with the literature on organisational effectiveness and strategic alignment, in order to develop a theory that answers the research questions. The theory that emerged centres on four main relationships: how understanding business processes contributes to improved service delivery; how important communication is in contributing to organisational performance; how effective planning has an impact on product complexity; and the impact that effective organisational planning has on the relationship between IT and operations.
The results of this study showed that although there was an intention to improve alignment between business and IT strategies, with some noteworthy initiatives emerging, a number of factors have inhibited successful alignment. These factors include: a lack of trust in IT solution delivery, IT remaining ignorant of the impact of process changes, the inability to allocate the business analysis function effectively to the correct change proposals, and the silos of process knowledge that exist within operations. The recommendations of this study include: improvements to the visibility of business processes; methods to improve knowledge sharing; and strengthening the focus of the business analysis function.
APA, Harvard, Vancouver, ISO, and other styles
26

Eichhorn, Bradford Reese. "THE IMPACT OF USER INVOLVEMENT ON INFORMATION SYSTEM PROJECTS." Cleveland State University / OhioLINK, 2014. http://rave.ohiolink.edu/etdc/view?acc_num=csu1410793063.

Full text
APA, Harvard, Vancouver, ISO, and other styles
27

Ahmed, Mohamed. "Multi-Level Safety Performance Functions for High Speed Facilities." Doctoral diss., University of Central Florida, 2012. http://digital.library.ucf.edu/cdm/ref/collection/ETD/id/5091.

Full text
Abstract:
High speed facilities are considered the backbone of any successful transportation system; Interstates, freeways, and expressways carry the majority of daily trips on the transportation network. Although these types of roads are considered relatively the safest among road types, they still experience many crashes, many of which are severe, which not only affect human lives but can also have tremendous economic and social impacts. These facts signify the necessity of enhancing the safety of these high speed facilities to ensure better and more efficient operation. Safety problems can be assessed through several approaches that help mitigate crash risk on a long- and short-term basis. Therefore, the main focus of the research in this dissertation is to provide a risk assessment framework to promote safety and enhance mobility on freeways and expressways. Multi-level Safety Performance Functions (SPFs) were developed at the aggregate level using historical crash data and the corresponding exposure and risk factors to identify and rank sites with promise (hot spots). Additionally, SPFs were developed at the disaggregate level utilizing real-time weather data collected from meteorological stations located along the freeway section as well as traffic flow parameters collected from different detection systems such as Automatic Vehicle Identification (AVI) and Remote Traffic Microwave Sensors (RTMS). These disaggregate SPFs can identify real-time risks due to turbulent traffic conditions and their interactions with other risk factors. In this study, two main datasets were obtained from two different regions. These datasets comprise historical crash data, roadway geometric characteristics, aggregate weather and traffic parameters, as well as real-time weather and traffic data.
At the aggregate level, Bayesian hierarchical models with spatial and random effects were compared to Poisson models to examine the safety effects of roadway geometrics on crash occurrence along freeway sections that feature mountainous terrain and adverse weather. At the disaggregate level; a main framework of a proactive safety management system using traffic data collected from AVI and RTMS, real-time weather and geometrical characteristics was provided. Different statistical techniques were implemented. These techniques ranged from classical frequentist classification approaches to explain the relationship between an event (crash) occurring at a given time and a set of risk factors in real time to other more advanced models. Bayesian statistics with updating approach to update beliefs about the behavior of the parameter with prior knowledge in order to achieve more reliable estimation was implemented. Also a relatively recent and promising Machine Learning technique (Stochastic Gradient Boosting) was utilized to calibrate several models utilizing different datasets collected from mixed detection systems as well as real-time meteorological stations. The results from this study suggest that both levels of analyses are important, the aggregate level helps in providing good understanding of different safety problems, and developing policies and countermeasures to reduce the number of crashes in total. At the disaggregate level, real-time safety functions help toward more proactive traffic management system that will not only enhance the performance of the high speed facilities and the whole traffic network but also provide safer mobility for people and goods. In general, the proposed multi-level analyses are useful in providing roadway authorities with detailed information on where countermeasures must be implemented and when resources should be devoted. 
The study also proves that traffic data collected from different detection systems could be a useful asset that should be utilized appropriately not only to alleviate traffic congestion but also to mitigate increased safety risks. The overall proposed framework can maximize the benefit of the existing archived data for freeway authorities as well as for road users.
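An aggregate-level SPF of the kind described above is, in essence, a regression of expected crash frequency on exposure variables, which is then compared against observed crashes to rank hot spots. The minimal sketch below assumes an illustrative functional form exp(b0) * AADT^b1 * L^b2; the coefficients and segment data are invented, and the dissertation itself calibrates far richer Bayesian hierarchical models.

```python
import math

def spf_expected_crashes(aadt, seg_len_mi, b0=-8.0, b1=0.8, b2=1.0):
    """Hypothetical aggregate-level SPF:
    E[crashes/yr] = exp(b0) * AADT^b1 * length^b2.
    Coefficients are illustrative only, not estimated values."""
    return math.exp(b0) * aadt ** b1 * seg_len_mi ** b2

# Invented freeway segments: exposure plus observed crash counts.
segments = {
    "A": {"aadt": 45000, "len": 1.2, "observed": 14},
    "B": {"aadt": 30000, "len": 0.8, "observed": 3},
    "C": {"aadt": 60000, "len": 2.0, "observed": 21},
}

# Rank "sites with promise" by excess crashes (observed - predicted):
# the larger the excess, the stronger the case for countermeasures.
ranking = sorted(
    segments,
    key=lambda s: segments[s]["observed"]
    - spf_expected_crashes(segments[s]["aadt"], segments[s]["len"]),
    reverse=True,
)
```

With these made-up numbers, segment C tops the ranking because its observed crash count exceeds the SPF prediction by the widest margin.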
Thesis (Ph.D.)--University of Central Florida, 2012. Includes bibliographical references.
Ph.D.
Doctorate
Civil, Environmental, and Construction Engineering
Engineering and Computer Science
Civil Engineering
APA, Harvard, Vancouver, ISO, and other styles
28

Bimerew, Million S. "Developing a framework for a district-based information management system for mental health care in the Western Cape." Thesis, University of Western Cape, 2013. http://hdl.handle.net/11394/3324.

Full text
Abstract:
Philosophiae Doctor - PhD
A review of the literature has shown that there is a lack of mental health information on which to base the planning of mental health services and decisions concerning programme development for mental health services. Several studies have indicated that the use of an evidence-based health information system (HIS) reduces inappropriate clinical practices and promotes the quality of health care services. This study was aimed at developing a framework for a district-based mental health information management system, utilising the experiences of health care providers and caregivers with a district mental health information system (DMHIS). Activity Theory was used as the philosophical foundation of the information system for the study. A qualitative approach was employed using semi-structured individual interviews, Focus Group Discussions (FGDs), a systematic review and document analysis. The intervention research design and development model of Rothman and Thomas (1994) was used to guide the study, which was conducted in the Cape Town Metropole area of the Western Cape. A purposive, convenience sampling method was employed to select study participants. Ethical clearance for the study was obtained from the University of the Western Cape, and permission to use the health facilities from the Department of Health. The data collection process involved 62 individual interview participants, ranging from mental health nurses to district health managers, health information clerks, patient caregivers/families and persons with stable mental conditions. Thirteen caregivers took part in the FGDs. Document review was conducted at three community mental health centres. The data were analysed manually using content analysis. Core findings of the interviews were a lack of standardized information collection tools and content for mental health, information infrastructure, capacity building, and resources.
Information processing, in terms of collecting, compiling, analysing, providing feedback on, accessing and sharing information, presented the major problems. Results from the document analysis identified inconsistencies and inaccuracies in information recording and processing, which in turn affected the quality of information for decision making. Results from the systematic review identified five functional elements: organizational structure; information infrastructure; capacity building; inputs, process, output and feedback; and community and stakeholder participation in the design and implementation of a mental health information system (MHIS). The study has contributed a framework for a DMHIS based on the findings of the empirical work and the systematic review. It is recommended that a HIS committee be established at district health facility level for effective implementation of the framework and quality information processing. There is a need to ensure that staff have the knowledge and skills required for effective implementation of an information system. It is recommended that higher education institutions include a course on HISs in their curricula. It is suggested that the South African Mental Health Policy be reviewed to include an MHIS and to ensure involvement of the community and stakeholders in this system, as well as adequate budget allocation.
APA, Harvard, Vancouver, ISO, and other styles
29

Donald, Ann Jean. "Landscape function analysis and ecological management of an agricultural landscape." Thesis, Stellenbosch : University of Stellenbosch, 2005. http://hdl.handle.net/10019.1/2842.

Full text
Abstract:
Thesis (MSc (Geography and Environmental Studies))--University of Stellenbosch, 2005.
In the past, development that would not be acceptable under current planning policy was allowed in agricultural areas. There is a growing need to develop and maintain highly productive and ecologically stable agricultural systems. One approach to encouraging better land management and utilisation is the international certification of a farm's production practices.
APA, Harvard, Vancouver, ISO, and other styles
30

Singh, Rajput Shivaram. "Increasing efficiency in ECU function development for Battery Management Systems." Thesis, KTH, Skolan för informations- och kommunikationsteknik (ICT), 2016. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-206088.

Full text
Abstract:
In the automotive industry today, the focus of ECU function development is always on finding the best possible combination of control algorithms and parameters. Complex algorithms with a broad implementation range require optimal calibration of ECU parameters to achieve the desired behaviour during the vehicle's drive cycle. With the growing functional complexity of automotive E/E systems, traditional approaches to designing automotive embedded systems are no longer suitable. To overcome this complexity, many of the leading automotive companies have formed a partnership to develop and establish an open industry standard for automotive E/E architecture called AUTOSAR. In this thesis, a toolchain for ECU function development following the AUTOSAR standard and an efficient measurement and calibration mechanism using XCP on CAN are investigated and implemented. Two toolchains are proposed, and their usage in different stages of ECU function development and in calibration is described. Both toolchains are tested to demonstrate that they work.
I området utveckling av funktionalitet på elektroniska styrsystem inom bilindustrin idag, ligger fokus på att finna den bästa kombinationen av reglermetoder och styrparametrar. Dessa avancerade system, med breda användningsområden, kräver bästa möjliga injustering av dess kalibrerbara parametrar, för att nå önskat beteende vid användning av fordonet. Det ökande omfånget av funktionskraven på styrsystemen, innebär att sedvanlig metodik för utveckling av dessa system inte är lämplig. För att kunna lösa dessa svårigheter, har de stora inom bilindustrin ingått ett samarbete, där de tillsammans skapat och utvecklar en industristandard för funktionsoch systemutveckling av styrsystem. Standarden kallas AUTOSAR. Denna rapport beskriver hur en kedja av utvecklingsverktyg som följer AUTOSAR-standarden kan användas, för att undersöka och använda en metod för systemövervakning och parameterkalibrering, genom användning av XCP över CAN.
APA, Harvard, Vancouver, ISO, and other styles
31

Leung, Tsui-shan, and 梁翠珊. "A functional analysis of GIS for slope management in Hong Kong." Thesis, The University of Hong Kong (Pokfulam, Hong Kong), 2000. http://hub.hku.hk/bib/B31223072.

Full text
APA, Harvard, Vancouver, ISO, and other styles
32

Hiscock, Anna Magdalena Kumi. "The role and function of communities of practice as a tool for organizational learning." Thesis, Stellenbosch : Stellenbosch University, 2005. http://hdl.handle.net/10019.1/50517.

Full text
Abstract:
Thesis (MPhil)--Stellenbosch University, 2005.
ENGLISH ABSTRACT: The objective of this study is to gain an understanding of the role and function of Communities of Practice in managing organizational knowledge. This thesis views the organization as a learning system and focuses on key characteristics of a learning organization and of Communities of Practice. The interrelationship between these themes, as well as their details, is described in depth. Some organizations seek to become learning organizations, yet implementation is elusive and is seldom based on research about what constitutes a learning culture. The goal of this study is to review and analyze the key characteristics of learning organizations and Communities of Practice, how they develop, and where one should start if a learning organization is to be created. A qualitative research methodology is followed to answer the why and how questions, and to describe, explain and interpret the findings. The literature review covers specific definitions, aspects and general factors concerning CoPs and learning organizations.
AFRIKAANSE OPSOMMING: Die doel van die studie is om die kenmerke van 'n leerorganisasie te verstaan en te beskryf, asook die doel en rol wat kennisgemeenskappe kan speel binne die konteks van so 'n organisasie om kennis te deel en te bestuur. Alhoewel sommige organisasies 'n werklike behoefte het om kennis te deel en te bestuur en uiteindelik as 'n leerorganisasie bekend te staan, is daar steeds as gevolg van onvoldoende navorsing probleme met suksesvolle implementering. Hierdie tesis poog om kennisgemeenskappe as een van die vele maniere om suksesvol te implementeer te beskryf. 'n Kwalitatiewe navorsingsmetode is gevolg om die hoekom en hoe vrae te beskryf, verduidelik en te interpreteer in die bevindinge. Die literatuurstudie sluit definisies, aspekte en algemene faktore rakende kennisgemeenskappe en leerorganisasies in.
APA, Harvard, Vancouver, ISO, and other styles
33

Ha, Wai On. "Empirical studies toward DRP constructs and a model for DRP development for information systems function." HKBU Institutional Repository, 2002. http://repository.hkbu.edu.hk/etd_ra/432.

Full text
APA, Harvard, Vancouver, ISO, and other styles
34

Groth, Philip. "Knowledge management and discovery for genotype/phenotype data." Doctoral thesis, Humboldt-Universität zu Berlin, Mathematisch-Naturwissenschaftliche Fakultät II, 2009. http://dx.doi.org/10.18452/16033.

Full text
Abstract:
Die Untersuchung des Phänotyps bringt z.B. bei genetischen Krankheiten ein Verständnis der zugrunde liegenden Mechanismen mit sich. Aufgrund dessen wurden neue Technologien wie RNA-Interferenz (RNAi) entwickelt, die Genfunktionen entschlüsseln und mehr phänotypische Daten erzeugen. Interpretation der Ergebnisse solcher Versuche ist insbesondere bei heterogenen Daten eine große Herausforderung. Wenige Ansätze haben bisher Daten über die direkte Verknüpfung von Genotyp und Phänotyp hinaus interpretiert. Diese Dissertation zeigt neue Methoden, die Entdeckungen in Phänotypen über Spezies und Methodik hinweg ermöglichen. Es erfolgt eine Erfassung der verfügbaren Datenbanken und der Ansätze zur Analyse ihres Inhalts. Die Grenzen und Hürden, die noch bewältigt werden müssen, z.B. fehlende Datenintegration, lückenhafte Ontologien und der Mangel an Methoden zur Datenanalyse, werden diskutiert. Der Ansatz zur Integration von Genotyp- und Phänotypdaten, PhenomicDB 2, wird präsentiert. Diese Datenbank assoziiert Gene mit Phänotypen durch Orthologie über Spezies hinweg. Im Fokus sind die Integration von RNAi-Daten und die Einbindung von Ontologien für Phänotypen, Experimentiermethoden und Zelllinien. Ferner wird eine Studie präsentiert, in der Phänotypendaten aus PhenomicDB genutzt werden, um Genfunktionen vorherzusagen. Dazu werden Gene aufgrund ihrer Phänotypen mit Textclustering gruppiert. Die Gruppen zeigen hohe biologische Kohärenz, da sich viele gemeinsame Annotationen aus der Gen-Ontologie und viele Protein-Protein-Interaktionen innerhalb der Gruppen finden, was zur Vorhersage von Genfunktionen durch Übertragung von Annotationen von gut annotierten Genen zu Genen mit weniger Annotationen genutzt wird. Zuletzt wird der Prototyp PhenoMIX präsentiert, in dem Genotypen und Phänotypen mit geclusterten Phänotypen, PPi, Orthologien und weiteren Ähnlichkeitsmaßen integriert und deren Gruppierungen zur Vorhersage von Genfunktionen, sowie von phänotypischen Wörtern genutzt.
In diseases with a genetic component, examination of the phenotype can aid understanding the underlying genetics. Technologies to generate high-throughput phenotypes, such as RNA interference (RNAi), have been developed to decipher functions for genes. This large-scale characterization of genes strongly increases phenotypic information. It is a challenge to interpret results of such functional screens, especially with heterogeneous data sets. Thus, there have been only few efforts to make use of phenotype data beyond the single genotype-phenotype relationship. Here, methods are presented for knowledge discovery in phenotypes across species and screening methods. The available databases and various approaches to analyzing their content are reviewed, including a discussion of hurdles to be overcome, e.g. lack of data integration, inadequate ontologies and shortage of analytical tools. PhenomicDB 2 is an approach to integrate genotype and phenotype data on a large scale, using orthologies for cross-species phenotypes. The focus lies on the uptake of quantitative and descriptive RNAi data and ontologies of phenotypes, assays and cell-lines. Then, the results of a study are presented in which the large set of phenotype data from PhenomicDB is taken to predict gene annotations. Text clustering is utilized to group genes based on their phenotype descriptions. It is shown that these clusters correlate well with indicators for biological coherence in gene groups, such as functional annotations from the Gene Ontology (GO) and protein-protein interactions. The clusters are then used to predict gene function by carrying over annotations from well-annotated genes to less well-characterized genes. Finally, the prototype PhenoMIX is presented, integrating genotype and phenotype data with clustered phenotypes, orthologies, interaction data and other similarity measures. Data grouped by these measures are evaluated for their predictiveness in gene functions and phenotype terms.
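The idea of grouping genes by the similarity of their phenotype descriptions can be sketched in a much simplified form. The example below uses plain token sets with Jaccard similarity and a greedy single-link pass; the gene identifiers, descriptions and threshold are invented for illustration, and the study itself relies on a more sophisticated text-clustering pipeline.

```python
def tokens(text):
    """Naive tokenizer: lowercase, whitespace-split, as a set."""
    return set(text.lower().split())

def jaccard(a, b):
    """Jaccard similarity between two token sets."""
    return len(a & b) / len(a | b)

# Hypothetical phenotype descriptions keyed by gene identifier.
phenotypes = {
    "geneA": "embryonic lethal reduced cell proliferation",
    "geneB": "reduced cell proliferation slow growth",
    "geneC": "abnormal wing morphology",
}

# Greedy single-link clustering: a gene joins the first cluster in
# which it is sufficiently similar to any existing member.
THRESHOLD = 0.3
clusters = []
for gene, desc in phenotypes.items():
    toks = tokens(desc)
    for cluster in clusters:
        if any(jaccard(toks, tokens(phenotypes[g])) >= THRESHOLD
               for g in cluster):
            cluster.append(gene)
            break
    else:
        clusters.append([gene])
```

In this toy run, geneA and geneB share enough phenotype vocabulary to land in one cluster, and an annotation from the better-characterized member could then be carried over to the other, which is the prediction principle the abstract describes.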
APA, Harvard, Vancouver, ISO, and other styles
35

Hackney, Raymond A. "The role of the information systems function for local government competitive services : an interpretive analysis of the management of change." Thesis, Cranfield University, 1994. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.483427.

Full text
APA, Harvard, Vancouver, ISO, and other styles
36

van, 't Hof David M. "Service Provisioning in SDN using a Legacy Network Management System." Thesis, KTH, Skolan för informations- och kommunikationsteknik (ICT), 2016. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-204957.

Full text
Abstract:
Software Defined Networking (SDN) has become increasingly popular in combination with Network Function Virtualization (NFV). SDN is a way to make a network more programmable and dynamic. However, in order to create a homogeneous network using this concept, legacy equipment would have to be substituted by SDN equipment, which is costly. To close the gap between the legacy world and SDN, we introduce the concept of a legacy Network Management System (NMS) that is connected to an SDN controller to perform service provisioning. This way, the NMS is capable of configuring both legacy and SDN networks to provide customers with the services that they have ordered, while still allowing for new SDN features in the SDN domain of the network. The main service we wish to provide using SDN is Service Function Chaining (SFC). Service provisioning consists of dynamically constructing a path through the ordered network services, in this case Virtual Network Functions (VNFs). This thesis focuses on the SDN controller and its interaction with the NMS. The project aims at configuring OpenFlow rules in the network using an SDN controller to perform SFC. Moreover, the focus is on how to represent an SDN element and a service function chain in the legacy NMS. The thesis also contains a discussion of what information should be exchanged between the management software and the controller. The management software used is called BECS, a system developed by Packetfront Software. Integrating SDN in BECS is done by creating a proof of concept containing a full environment, from the low-level network elements to the NMS. By using a bottom-up approach for creating this proof of concept, the information that BECS is required to send to the SDN controller can be identified before designing and implementing the connection between these two entities. When sending the information, the NMS should be able to receive an acknowledgement of a successful information exchange, or an error.
However, when the proof of concept was created, the problem arose of how to test and troubleshoot it. For this reason, a web Graphical User Interface (GUI) was created. This GUI shows the number of packets that have gone through each VNF. Because it is possible to see how many packets go through a VNF, one can see where a network issue occurs. The subsequent analysis investigates the impact of making such a GUI available to a network administrator and finds that the part of the network where a configuration error occurs can be narrowed down significantly.
Software Defined Networking (SDN) has become increasingly popular in combination with Network Function Virtualization (NFV). SDN is a way of making a network more programmable and dynamic. However, to create a homogeneous network based on this concept, traditional equipment must be replaced with expensive SDN equipment. To bridge the gap between traditional networks and the SDN world, we introduce a concept in which a traditional Network Management System (NMS) is connected to an SDN controller to perform service provisioning. In this way, the NMS can configure both traditional and SDN networks and provision services for customers, while new SDN features are enabled in the SDN part of the network. The main service we want to launch through SDN is Service Function Chaining (SFC). Service provisioning consists of constructing a path through the ordered services, in this case Virtual Network Functions (VNFs). This thesis focuses mainly on the SDN controller and its interaction with the NMS. The project aims to configure OpenFlow rules in the SDN controller to perform SFC. In addition, the work focuses on how SDN elements and SFCs can be represented in a traditional NMS, and discusses what information should be exchanged between the NMS and the SDN controller. The NMS to be used is BECS, a system developed by Packetfront Software. The task is solved by creating a proof of concept comprising a complete system with all components, from the network elements to the NMS. By using a bottom-up strategy for this proof of concept, the information that BECS must send to the SDN controller can be identified before the connection between the two can be designed and implemented. Once the information has been sent, the NMS should be able to retrieve information on whether the controller received it without errors. However, a problem arises regarding how to test and troubleshoot this proof of concept.
For this reason, a web Graphical User Interface (GUI) was created. The GUI shows the number of packets passing through each VNF, as well as where in the network an error occurs. The analysis examines how large the effect is for a network administrator and shows that the area where errors can occur is narrowed down considerably.
APA, Harvard, Vancouver, ISO, and other styles
37

Mittal, Neeraj. "Efficiency enhancing effects of IT investment on other factor inputs and accounting identity approach to value of IT." Connect to this title online, 2004. http://rave.ohiolink.edu/etdc/view?acc%5Fnum=osu1085362216.

Full text
Abstract:
Thesis (Ph. D.)--Ohio State University, 2004.
Title from first page of PDF file. Document formatted into pages; contains xi, 120 p.; also includes graphics. Includes bibliographical references (p. 115-120). Available online via OhioLINK's ETD Center.
APA, Harvard, Vancouver, ISO, and other styles
38

Fenati, Andrea. "Data Locality in Serverless Computing." Bachelor's thesis, Alma Mater Studiorum - Università di Bologna, 2020. http://amslaurea.unibo.it/20401/.

Full text
Abstract:
In recent years serverless computing, a new cloud paradigm, has experienced rapid growth. This model, also called Function as a Service (FaaS), allows the execution of stateless functions in response to asynchronous events. Its rise in popularity stems from its ease of use: the developer only has to write the function code and specify the resource requirements in the console of the chosen provider. Everything else, including resource sizing, is handled automatically by the cloud provider according to the requested workload. Moreover, FaaS offers novel approaches to software design and development, together with greater flexibility in usage and in cost accounting. This work is part of a broader project involving a Master's student in Computer Science at the University of Bologna and two co-supervisors from the University of Southern Denmark. Starting from the open-source serverless platform Apache OpenWhisk, the project aims to demonstrate the importance of data locality during the function-scheduling phase. Data locality matters for reducing execution times when functions need to interact with databases. As shown in this thesis, executing cloud functions as close as possible to the data they use considerably reduces latency.
APA, Harvard, Vancouver, ISO, and other styles
39

Van, Wyk Christoffel. "The development of an education management information system from a sensemaking perspective and the application of quantitative methods to analyse education data sets." Thesis, Link to the online version, 2006. http://hdl.handle.net/10019/1276.

Full text
APA, Harvard, Vancouver, ISO, and other styles
40

YE, Liu. "A simulation experimental study on the utility of pay changes." Digital Commons @ Lingnan University, 2017. https://commons.ln.edu.hk/cds_etd/17.

Full text
Abstract:
In this thesis, we conduct an experimental simulation of 131 students from a university in Hong Kong and investigate the relationship between pay changes and the perceived values (i.e., utility). Applying traditional psychophysical methods, we measure the utility of pay changes (i.e., pay raises and pay cuts) of different sizes by individual responses (i.e., happiness/unhappiness). Drawing on utility theory and expectancy theory, we examine the function that best fits this relationship by considering common function forms including linear, quadratic, logarithmic, and power functions. Using regression techniques, we find that a quadratic function best fits the data, and the utility function is concave in the pay change. When we examine the best form of utility functions for pay raises and pay cuts separately, we find that the utility of pay raises and that of pay cuts are best described by a quadratic function and a linear function, respectively. We further show that a single model involving all pay changes better describes the utility than two separate models for pay raises and pay cuts. In addition, our best-fit utility model reveals that a sufficiently small amount of pay increase may generate a negative value of utility, and we calculate the percentage of smallest meaningful pay increase that results in non-negative utility. We also discuss the theoretical contributions of our findings to the literature and their implications to practitioners.
APA, Harvard, Vancouver, ISO, and other styles
41

Jiao, Lianmeng. "Classification of uncertain data in the framework of belief functions : nearest-neighbor-based and rule-based approaches." Thesis, Compiègne, 2015. http://www.theses.fr/2015COMP2222/document.

Full text
Abstract:
In many classification problems, data are inherently uncertain. The available training data may be imprecise, incomplete, or even unreliable. In addition, partial expert knowledge characterizing the classification problem may also be available. These different types of uncertainty pose great challenges for classifier design. The theory of belief functions provides a rigorous and elegant framework for representing and combining a wide variety of uncertain information. In this thesis, we use this theory to address uncertain-data classification problems based on two popular approaches, namely the k-nearest-neighbor (kNN) rule and rule-based methods. For the kNN rule, one concern is that imprecise training data in regions where classes overlap can significantly affect its performance. An editing method was developed in the framework of belief functions to model the imprecise information carried by samples in the overlapping regions. Another consideration is that sometimes only an incomplete training data set is available, in which case the performance of the kNN rule degrades considerably. Motivated by this problem, we developed an efficient fusion method for combining a set of pairwise kNN classifiers using locally learned pairwise metrics. For rule-based methods, in order to improve their performance in complex applications, we extended the traditional approach in the framework of belief functions and developed a belief rule-based classification system to handle uncertain information in complex classification problems.
Moreover, in some applications, expert knowledge may be available in addition to training data. We therefore developed a hybrid belief rule-based classification system that uses both types of information jointly for classification.
In many classification problems, data are inherently uncertain. The available training data might be imprecise, incomplete, even unreliable. Besides, partial expert knowledge characterizing the classification problem may also be available. These different types of uncertainty bring great challenges to classifier design. The theory of belief functions provides a well-founded and elegant framework to represent and combine a large variety of uncertain information. In this thesis, we use this theory to address the uncertain data classification problems based on two popular approaches, i.e., the k-nearest neighbor rule (kNN) and rule-based classification systems. For the kNN rule, one concern is that the imprecise training data in class overlapping regions may greatly affect its performance. An evidential editing version of the kNN rule was developed based on the theory of belief functions in order to well model the imprecise information for those samples in overlapping regions. Another consideration is that, sometimes, only an incomplete training data set is available, in which case the ideal behaviors of the kNN rule degrade dramatically. Motivated by this problem, we designed an evidential fusion scheme for combining a group of pairwise kNN classifiers developed based on locally learned pairwise distance metrics. For rule-based classification systems, in order to improve their performance in complex applications, we extended the traditional fuzzy rule-based classification system in the framework of belief functions and developed a belief rule-based classification system to address uncertain information in complex classification problems. Further, considering that in some applications, apart from training data collected by sensors, partial expert knowledge can also be available, a hybrid belief rule-based classification system was developed to make use of these two types of information jointly for classification.
APA, Harvard, Vancouver, ISO, and other styles
42

Shrivastava, Utkarsh. "Analytics for Novel Consumer Insights (A Three Essay Dissertation)." Scholar Commons, 2018. https://scholarcommons.usf.edu/etd/7711.

Full text
Abstract:
Both literature and practice have investigated how the vast amount of ever-increasing customer information can inform marketing strategy and decision making. However, customer data is often susceptible to modeling bias and misleading findings due to various factors, including sample selection and unobservable variables. The available analytics toolkit has continued to develop, but in the age of nearly perfect information, customer decision making has also evolved. The dissertation addresses some of the challenges in deriving valid and useful consumer insights from customer data in the digital age. The first study addresses the limitations of traditional customer purchase measures in accounting for dynamic temporal variations in the customer purchase history. The study proposes a new approach for the representation and summarization of customer purchases to improve promotion forecasts. The method also accounts for sample selection bias that arises due to biased selection of customers for the promotion. The second study investigates the impact of increasing internet penetration on consumer choices and their response to marketing actions. Using the case study of physicians' drug prescribing, the study identifies how marketers can misallocate resources at the regional level by not accounting for variations in internet penetration. The third paper develops a data-driven metric for measuring temporal variations in brand loyalty. Using a network representation of brands and customers, the study also investigates the spillover effects of manufacturer-related information shocks on brand loyalty.
APA, Harvard, Vancouver, ISO, and other styles
43

Gruber, Thomas. "Prozessintegrierte Dokumentation und optimierte Wiederverwendung von Simulationsmodellen der automobilen Funktionsabsicherung." Doctoral thesis, Universitätsbibliothek Chemnitz, 2016. http://nbn-resolving.de/urn:nbn:de:bsz:ch1-qucosa-202845.

Full text
Abstract:
Today, the creation, preservation and use of knowledge is an important pillar of a company's competitiveness in the market. Against this background, modern function development in the automotive industry in particular faces the challenge of developing ever new, highly interconnected vehicle functions and bringing them to market in ever shorter time and at ever lower cost. To meet this challenge, model-based development has become established, with the goal of mastering this growing complexity. It makes it possible to abstract development tasks at different levels and to realize a distributed, interconnected development. The development of a single function today often requires several hundred people, who must be integrated into a common development process. Here, concepts are lacking to ensure the flow of information and knowledge between the process participants. In this context, the present work develops an approach for the process-integrated documentation of the development artifacts required in model-based development processes. The approach considers the complete information flow, from the definition of required information, through its automated acquisition and processing, to its targeted reuse. The work then sketches the architecture of an information system that enables this continuity in arbitrary model-based development processes, and transfers it, to validate the approach, to a concrete development process of automotive function development. The focus of the approach lies in particular on integration into existing development processes without changing them. On the one hand, this is achieved by a model-based description of the information model, using methods such as those applied in the function development process.
Process participants can thus understand the information model themselves and make adaptations when needed, without depending on trained experts. On the other hand, the architectural approach allows direct access to existing development systems and the documentation-relevant information they contain.
Today, the creation, preservation and exploitation of knowledge represent key factors of the competitiveness of companies in the global market. In this context, modern function development in the automotive industry faces the challenge of bringing new, highly interconnected vehicle functions to market at shorter time and lower cost. To meet these challenges and manage the growing complexity, a model-based development process has been established. Thus, it is possible to distribute development tasks to different levels of abstraction and enable a distributed, interconnected function development. This development involves up to several hundred persons per function, who have to be integrated in a common development process. Especially when it comes to managing the information and knowledge flow between the process participants, there is a lack of concepts to support this communication. Based on this context, this work presents an approach for process-integrated documentation of the necessary development artifacts in model-based development processes. This approach considers the complete information flow, from the definition of necessary information, over automatic acquisition and processing, to its targeted reuse during the process. Subsequently, this work sketches the architecture of an information system which enables this continuous approach to be applied to any model-based development process. For validation purposes, the approach is then applied to an actual development process of the automotive function development. The focus of the presented approach lies in the integration into existing development processes without changing them. On the one hand, this is achieved by applying a model-based description of the information model using methods that can be found in the function development process today. Thus, process participants can understand the information model themselves and apply changes when they are required, without the necessity of qualified experts.
On the other hand, the architectural approach allows direct access to existing development systems and the documentation-relevant information they contain.
APA, Harvard, Vancouver, ISO, and other styles
44

Jönsson, Malin, and Lisa Rydhage. "Digitalisering av ekonomifunktionen : En studie på tre fallföretag om konsekvenser avseende relationen mellan processer, hantering av information samt ekonomens förändrade roll." Thesis, Linnéuniversitetet, Institutionen för ekonomistyrning och logistik (ELO), 2019. http://urn.kb.se/resolve?urn=urn:nbn:se:lnu:diva-86137.

Full text
Abstract:
Background: The development of digitalization has brought an increasing amount of information and a redistribution of standardized tasks toward more qualified work. These consequences have influenced the accounting function with higher demands on information management and a change in the role of the economist. It is also highlighted that digitalization of the accounting function's standardized processes gives rise to efficiency, but no further understanding of how this influences the non-standardized processes and the role of the economist. Purpose: The purpose of the study is to describe how the development of digitalization affects the processes of the accounting function, and hence which tasks are possible to digitalize. Furthermore, the study intends to analyze the consequences, regarding demands on information management and the changed role of the economist, that arise when the accounting function's processes are digitalized. The study thereby aims to extend existing knowledge by explaining the relationship between processes, information management and the changed role of the economist. Method: To fulfill the purpose of the study, a case study of three companies was carried out, where semi-structured interviews formed the basis for the empirical data collection. The interviews were conducted to create an overall picture of the impact of digitalization on the accounting function. Conclusion: It can be concluded that the relationships between the processes of the accounting function, the management of information and the changed role of the economist are crucial for achieving an efficient accounting function. The economist's analytical and digital competence, combined with digital systems, enables good information management where valuable information is generated. This promotes a greater focus on the non-standardized processes and thus the economist's ability to analyze information and generate a well-founded basis for decisions.
Background: The development of digitalization has brought an increasing amount of information as well as a redistribution of standardized activities toward more qualified work. These implications have influenced the accounting function with higher demands on information management and an evolution of the role of the economist. It is clear that the digitalization of the accounting function's standardized processes has facilitated efficiency. However, there is a lack of understanding about how the non-standardized processes and the role of the economist have been affected by this. Purpose: The aim of this study is to describe how the development of digitalization affects the processes of the accounting function as well as which tasks can be digitalized. The aim is also to analyze the implications, regarding the demands on information management and the evolving role of the economist, that follow from the digitalization of the accounting function. The purpose of this study is also to contribute with greater knowledge by explaining the relationship between processes, information management and the evolving role of the economist. Method: To fulfill the purpose of the study, a case study of three companies was carried out, where semi-structured interviews formed the basis for the empirical data collection. The interviews were conducted to create an overall view of the impact of digitalization on the accounting function. Conclusion: It can be concluded that the relations between the processes of the accounting function, the information management and the evolving role of the economist are crucial for achieving an efficient accounting function. The economist's analytical and digital competence, combined with digital systems, enables beneficial information management that creates valuable information.
This enables a greater focus on the non-standardized processes and the economist's ability to analyze the information and elaborate a favorable basis for decisions.
APA, Harvard, Vancouver, ISO, and other styles
45

Clausen, Mork Jonas. "Dealing with uncertainty." Doctoral thesis, KTH, Filosofi, 2012. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-72680.

Full text
Abstract:
Uncertainty is, it seems, more or less constantly present in our lives. Even so, grasping the concept philosophically is far from trivial. In this doctoral thesis, uncertainty and its conceptual companion information are studied. Axiomatic analyses are provided and numerical measures suggested. In addition to these basic conceptual analyses, the widespread practice of so-called safety factor use in societal regulation is analyzed along with the interplay between science and policy in European regulation of chemicals and construction.
QC 20120202
APA, Harvard, Vancouver, ISO, and other styles
46

Önsari, Burak. "Erfarenhetsåterföring och informationshanteringi en komplex organisation." Thesis, KTH, Hållbar produktionsutveckling (ML), 2019. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-253801.

Full text
Abstract:
The goal of the thesis is to produce a list of actions intended to improve document handling and information storage at Arcona, and to improve the availability of relevant documents and information. The actions are to be given an internal order of priority, and the requirements that should be placed on the systems involved are to be investigated. The focus is on experience feedback through documentation and information management. Quality Function Deployment (customer-centered planning) is applied in carrying out the project. With the role of the production manager as the starting point, customer wishes have been collected and interpreted in order to present adequate proposals for improvement actions that the client can apply. The result comprises a list of actions ranked according to a proposed preliminary prioritization, at the top of which we find, in brief, among others:
- Strengthening employees' awareness of routines and of where to find particular documents and information.
- A rating system for subcontractors engaged by Arcona AB.
- The production of specifically listed documents.
- A routine for experience feedback recurring in different parts of the production.
Information on what each item comprises can be found in the report.
The aim of the thesis work is, with the role and function of the production manager in focus, to bring forth a list of suggested actions on how the management of documents and information can be improved and how relevant documents and information can be made more accessible. The suggested actions shall be arranged in prioritized order, and the work investigates the requirements of a system that can manage the documents and data adequately. The focus has been on utilizing experience feedback through documentation and information management. The method QFD (Quality Function Deployment) has been applied during the thesis work. The customer requirements and needs of the co-workers, primarily the production managers, have been collected and interpreted in order to present adequate suggestions for improvement actions that the employer can apply. The result contains a list of actions ranked from the most highly to the least highly recommended action to implement. At the top, we find among other actions:
- Reinforcing the co-workers' knowledge of routines and of where specific documents and information are kept.
- A score-based rating system for rating subcontractors hired by Arcona AB.
- The production of certain listed documents.
- A routine for documenting experience feedback recurring at specific stages of the production.
Information regarding what every action includes is found in the thesis.
APA, Harvard, Vancouver, ISO, and other styles
47

Gruber, Thomas. "Prozessintegrierte Dokumentation und optimierte Wiederverwendung von Simulationsmodellen der automobilen Funktionsabsicherung." Universitätsverlag Chemnitz, 2015. https://monarch.qucosa.de/id/qucosa%3A20450.

Full text
Abstract:
Today, the creation, preservation and use of knowledge is an important pillar of a company's competitiveness in the market. Against this background, modern function development in the automotive industry in particular faces the challenge of developing ever new, highly interconnected vehicle functions and bringing them to market in ever shorter time and at ever lower cost. To meet this challenge, model-based development has become established, with the goal of mastering this growing complexity. It makes it possible to abstract development tasks at different levels and to realize a distributed, interconnected development. The development of a single function today often requires several hundred people, who must be integrated into a common development process. Here, concepts are lacking to ensure the flow of information and knowledge between the process participants. In this context, the present work develops an approach for the process-integrated documentation of the development artifacts required in model-based development processes. The approach considers the complete information flow, from the definition of required information, through its automated acquisition and processing, to its targeted reuse. The work then sketches the architecture of an information system that enables this continuity in arbitrary model-based development processes, and transfers it, to validate the approach, to a concrete development process of automotive function development. The focus of the approach lies in particular on integration into existing development processes without changing them. On the one hand, this is achieved by a model-based description of the information model, using methods such as those applied in the function development process.
Process participants can thus understand the information model themselves and make adaptations when needed, without depending on trained experts. On the other hand, the architectural approach allows direct access to existing development systems and the documentation-relevant information they contain.
Today, the creation, preservation and exploitation of knowledge represent key factors of the competitiveness of companies in the global market. In this context, modern function development in the automotive industry faces the challenge of bringing new, highly interconnected vehicle functions to market at shorter time and lower cost. To meet these challenges and manage the growing complexity, a model-based development process has been established. Thus, it is possible to distribute development tasks to different levels of abstraction and enable a distributed, interconnected function development. This development involves up to several hundred persons per function, who have to be integrated in a common development process. Especially when it comes to managing the information and knowledge flow between the process participants, there is a lack of concepts to support this communication. Based on this context, this work presents an approach for process-integrated documentation of the necessary development artifacts in model-based development processes. This approach considers the complete information flow, from the definition of necessary information, over automatic acquisition and processing, to its targeted reuse during the process. Subsequently, this work sketches the architecture of an information system which enables this continuous approach to be applied to any model-based development process. For validation purposes, the approach is then applied to an actual development process of the automotive function development. The focus of the presented approach lies in the integration into existing development processes without changing them. On the one hand, this is achieved by applying a model-based description of the information model using methods that can be found in the function development process today. Thus, process participants can understand the information model themselves and apply changes when they are required, without the necessity of qualified experts.
On the other hand, the architectural approach allows direct access to existing development systems and the documentation-relevant information they contain.
APA, Harvard, Vancouver, ISO, and other styles
48

Hunt, Julian David. "Integration of rationale management with multi-criteria decision analysis, probabilistic forecasting and semantics : application to the UK energy sector." Thesis, University of Oxford, 2013. http://ora.ox.ac.uk/objects/uuid:2cc24d23-3e93-42e0-bb7a-6e39a65d7425.

Full text
Abstract:
This thesis presents a new integrated tool and decision support framework to approach complex problems resulting from the interaction of many multi-criteria issues. The framework is embedded in an integrated tool called OUTDO (Oxford University Tool for Decision Organisation). OUTDO integrates Multi-Criteria Decision Analysis (MCDA), decision rationale management with a modified Issue-Based Information Systems (IBIS) representation, and probabilistic forecasting to effectively capture the essential reasons why decisions are made and to dynamically re-use the rationale. In doing so, it allows exploration of how changes in external parameters affect complicated and uncertain decision making processes in the present and in the future. Once the decision maker constructs his or her own decision process, OUTDO checks if the decision process is consistent and coherent and looks for possible ways to improve it using three new semantic-based decision support approaches. For this reason, two ontologies (the Decision Ontology and the Energy Ontology) were integrated into OUTDO to provide it with these semantic capabilities. The Decision Ontology keeps a record of the decision rationale extracted from OUTDO and the Energy Ontology describes the energy generation domain, focusing on the water requirement in thermoelectric power plants. A case study, with the objective of recommending electricity generation and steam condensation technologies for ten different regions in the UK, is used to verify OUTDO’s features and reach conclusions about the overall work.
APA, Harvard, Vancouver, ISO, and other styles
49

Rastogi, Rahul. "Information security service management : a service management approach to information security management." Thesis, Nelson Mandela Metropolitan University, 2011. http://hdl.handle.net/10948/1389.

Full text
Abstract:
In today’s world, information and the associated Information Technology are critical assets for many organizations. Any information security breach, or compromise of these assets, can lead to serious implications for organizations that are heavily dependent on these assets. For such organizations, information security becomes vital. Organizations deploy an information security infrastructure for protecting their information assets. This infrastructure consists of policies and controls. Organizations also create an information security management system for managing information security in the organization. While some of the policies and controls are of a purely technical nature, many depend upon the actions of end-users. However, end-users are known to exhibit both compliant and non-compliant behaviours in respect of these information security policies and controls in the organization. Non-compliant information security behaviours of end-users have the potential to lead to information security breaches. Non-compliance thus needs to be controlled. The discipline of information security and its management have evolved over the years. However, the discipline has retained the technology-driven nature of its origin. In this context, the discipline has failed to adequately appreciate the role played by the end-users and the complexities of their behaviour, as it relates to information security policies and controls. The prevailing information security management philosophy is that of treating end-users as the enemy. Compliance is sought through awareness programs, rewards, punishments and ever stricter policies and controls. This has led to a bureaucratic information security management approach. The philosophy of treating end-users as the enemy has had an adverse impact on information security in the organization.
It can be said that rather than curbing non-compliance by end-users, the present-day bureaucratic approach to information security management has contributed to non-compliance. This thesis calls this the end-user crisis. This research aims at resolving this crisis by identifying an improved approach to information security management in the organization. This research applies the service management approach to information security management. The resultant Information Security Service Management (ISSM) views end-users as assets and resources, and not as enemies. The central idea of ISSM is that the end-user is to be treated as a customer, whose needs are to be satisfied. This research presents ISSM, along with its various components, to aid its implementation in an organization.
APA, Harvard, Vancouver, ISO, and other styles
50

Rauschmayer, Axel. "Connected Information Management." Diss., lmu, 2010. http://nbn-resolving.de/urn:nbn:de:bvb:19-114390.

Full text
APA, Harvard, Vancouver, ISO, and other styles
