To see the other types of publications on this topic, follow the link: Complexity science framework.

Dissertations / Theses on the topic 'Complexity science framework'

Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles


Consult the top 15 dissertations / theses for your research on the topic 'Complexity science framework.'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever it is available in the metadata.

Browse dissertations / theses on a wide variety of disciplines and organise your bibliography correctly.

1

Papastefano, N., and S. E. Arnoldi-van der Walt. "A complexity science-based management framework for virtual organisations." African Journal of Business Management, 2010. http://encore.tut.ac.za/iii/cpro/DigitalItemViewPage.external?sp=1001182.

Full text
Abstract:
The virtual organisation challenges traditional management assumptions because a new means of coordinating globally dispersed employees is needed. To understand the collective activities of a workforce separated by space and time, this paper describes a complexity science-based management framework for virtual organisations. Specific focus is on a South African virtual organisation as a complex adaptive system. A single, embedded case study strategy was followed, and multiple data sources were used to generate theory. In this paper, results are reported that clarify the management of an organisation where technology replaces conventional face-to-face contexts for socialisation and assimilation. The paper shows how managers create a virtual context for sharing meaning and interaction through synergy, empowerment, participation and an accountable, committed workforce.
APA, Harvard, Vancouver, ISO, and other styles
2

Ooi, James M. 1970. "A framework for low-complexity communication over channels with feedback." Thesis, Massachusetts Institute of Technology, 1998. http://hdl.handle.net/1721.1/10050.

Full text
Abstract:
Thesis (Ph. D.)--Massachusetts Institute of Technology, Dept. of Electrical Engineering and Computer Science, 1998.
Includes bibliographical references (p. 181-185).
by James Meng-Hsien Ooi.
Ph.D.
APA, Harvard, Vancouver, ISO, and other styles
3

Stockton, Imogen. "Organisational resilience within a complexity science framework : A case study of Ballarat City Council." Thesis, Federation University of Ballarat, 2016. http://researchonline.federation.edu.au/vital/access/HandleResolver/1959.17/154227.

Full text
Abstract:
Understanding the resilience of organisations, their vulnerabilities and their capacity to adapt to an unknown future is critical because modern society is dependent upon the continuation of these systems, or of alternative systems, which support humans, their communities and the environment. The challenge for organisations assessing their resilience is to find a way to undertake this assessment that best meets the needs of the organisation and the context in which it operates. Thus, this study aims to develop an understanding of resilience, in particular organisational resilience, and to develop a means of identifying resilience in an organisation. A conceptual model of organisational resilience was developed together with an operational Framework of Analysis, which was then applied to the Ballarat City Council as a single case study. The conceptual model proposes that resilience is a state of being, that is, a proximity to the edge of chaos, where the connections between agents within a system are most flexible. The absence of rigid, inflexible connections enables agents within a complex adaptive system to innovate, co-evolve and adapt to changing circumstances. This is achieved by having an awareness of the fitness landscape, having the flexibility to manage vulnerabilities and being able to adapt. Co-evolution, adaptation and creativity occur most readily in close proximity to the edge of chaos. Using a Critical Realist approach, the Ballarat City Council case study evaluates the Framework of Analysis. Data collection occurred over a six-month period, with the primary sources of data being an organisational document analysis, interviews and an infrastructure assessment. The results indicate that situational awareness, the identification and management of keystone vulnerabilities and an increase in adaptive capacity act as mechanisms of adaptation and are integral to an organisation achieving a position of resilience.
This research presents a new perspective to the concept of resilience, in which resilience is a position relative to the edge of chaos, rather than a process or set of characteristics.
Doctor of Philosophy
APA, Harvard, Vancouver, ISO, and other styles
4

Chan, Albert M. (Albert Michael) 1975. "A framework for low-complexity iterative interference cancellation in communication systems." Thesis, Massachusetts Institute of Technology, 2004. http://hdl.handle.net/1721.1/28537.

Full text
Abstract:
Thesis (Ph. D.)--Massachusetts Institute of Technology, Dept. of Electrical Engineering and Computer Science, 2004.
Includes bibliographical references (p. 211-215).
Communication over interference channels poses challenges not present for the more traditional additive white Gaussian noise (AWGN) channels. In order to approach the information limits of an interference channel, interference mitigation techniques need to be integrated with channel coding and decoding techniques. This thesis develops such practical schemes for the case in which the transmitter has no knowledge of the channel. The interference channel model we use is described by r = Hx + w, where r is the received vector, H is an interference matrix, x is the transmitted vector of data symbols chosen from a finite set, and w is a noise vector. The objective at the receiver is to detect the most likely vector x that was transmitted based on knowledge of r, H, and the statistics of w. Communication contexts in which this general integer programming problem appears include the equalization of intersymbol interference (ISI) channels, the cancellation of multiple-access interference (MAI) in code-division multiple-access (CDMA) systems, and the decoding of multiple-input multiple-output (MIMO) systems in fading environments. We begin by introducing mode-interleaved precoding, a transmitter precoding technique that conditions an interference channel so that the pairwise error probability of any two transmit vectors becomes asymptotically equal to the pairwise error probability of the same vectors over an AWGN channel at the same signal-to-noise ratio (SNR). While mode-interleaved precoding dramatically increases the complexity of exact ML detection, we develop iterated-decision detection to mitigate this complexity problem. Iterated-decision detectors use optimized multipass algorithms to successively cancel interference from r and generate symbol decisions whose reliability increases monotonically with each iteration. When used in uncoded systems with mode-interleaved precoding, iterated-decision detectors asymptotically achieve the performance of ML detection (and thus the interference-free lower bound) with considerably lower complexity. We interpret these detectors as low-complexity approximations to message-passing algorithms. The integration of iterated-decision detectors into communication systems with coding is also developed to approach information rates close to theoretical limits. We present joint detection and decoding algorithms based on the iterated-decision detector with mode-interleaved precoding, and also develop analytic tools to predict the behavior of such systems. We discuss the use of binary codes for channels that support low information rates, and multilevel codes and lattice codes for channels that support higher information rates.
by Albert M. Chan.
Ph.D.
APA, Harvard, Vancouver, ISO, and other styles
5

Tisdale, Susan M. "Architecting a Cybersecurity Management Framework: Navigating and Traversing Complexity, Ambiguity, and Agility." Thesis, Robert Morris University, 2018. http://pqdtopen.proquest.com/#viewpdf?dispub=10825513.

Full text
Abstract:

Despite advancements in technology, countermeasures, and situational awareness, cybersecurity (CS) breaches continue to increase in number, complexity, and severity. This qualitative study is one of a few to comprehensively explore CS management. The study used a systems approach to identify business, socioeconomic, and information technology (IT) factors and their interrelationships. The study examined IT management frameworks, CS standards, and the literature. Interviews and a focus group of subject matter experts followed. The research found CS is a leadership, not a technical, issue. CS is an ecosystem; its components are interrelated and inseparable, requiring qualitative, subjective, risk and knowledge management interventions. CS, IT, and threats are too complex and volatile for organizations to manage all risks and vulnerabilities in a timely, agile manner. CS lexicons lack uniformity and consistency. An IT management framework is better suited for CS. Companies must segregate and encrypt the most sensitive information and curb their appetites for new, unsecured technology. CS and IT are multilayered, requiring subspecialists, who often serve conflicting business needs and security objectives. Organizations need to minimize mid-level CS management, raise CS to a business-level function (not subordinate to IT), and involve cyber specialists at all levels of the business lifecycle. Cross-pollinating people from all business areas, especially from finance, CS, and IT, increases awareness of the others’ responsibilities and obligations and facilitates more rapid portfolio and lifecycle CS activities, from investments to detection and response activities. Future studies should focus on these issues as critical success factors. Finally, the study of CS requires an agile, qualitative, multidisciplinary methodology to produce thick, quick, actionable information.

APA, Harvard, Vancouver, ISO, and other styles
6

Grotsky, Dan Moshe 1971. "A new framework for making sourcing decisions regarding low-volume, high-complexity products." Thesis, Massachusetts Institute of Technology, 2002. http://hdl.handle.net/1721.1/34719.

Full text
Abstract:
Thesis (M.B.A.)--Massachusetts Institute of Technology, Sloan School of Management; and, (S.M.)--Massachusetts Institute of Technology, Dept. of Electrical Engineering and Computer Science; in conjunction with the Leaders for Manufacturing Program at MIT, February 2002.
Includes bibliographical references (leaves 49-50).
Compaq Computer Corporation's High Performance Systems Business Unit (HPSBU) manufactures a series of high-end computer servers called Alpha Servers. These servers are manufactured in relatively low volumes, typically for large institutions that require complex computer systems for either rapid number processing, as in scientific applications, or massive data processing, as in large database applications. They are mostly custom-configured for each customer, each server specifically assembled and each system specifically configured to meet each customer's needs. As computer manufacturing processes become more standardized, and computers almost commoditized, it becomes impractical to manufacture all system components in-house. To that end, Compaq has gradually outsourced more and more of the functions that, combined, are necessary to deliver finished product to Compaq's Alpha Server customers. For instance, as computer manufacturing technology progressed, it quickly became evident that keyboard manufacturing can, and should, be outsourced to a contract manufacturer, which can achieve economies of scale and produce large quantities of standard keyboards at minimal cost. At the other extreme, Compaq has made sure to keep most of its core competencies in-house, in order to preserve its competitive advantage. The key question Compaq faces today is which functions to keep in-house and which to outsource. A new conceptual model for making this make-or-buy decision is presented. The purpose of this model is to raise the numerous issues at stake when considering the outsourcing of a particular function, specifically when dealing with low-volume, high-complexity products such as the Alpha Server. This model provides Compaq with a structured method of analyzing the various components that make up the finished product delivered to the customer, and of deciding which need to be maintained in-house, which should be outsourced, and which of those can be outsourced.
Initial model implementation was performed on the latest Alpha Server product family, dubbed Miracle for the purpose of this document.
by Dan Moshe Grotsky.
S.M.
M.B.A.
APA, Harvard, Vancouver, ISO, and other styles
7

Cintron, Jose. "A Framework for Measuring the Value-Added of Knowledge Processes with Analysis of Process Interactions and Dynamics." Doctoral diss., University of Central Florida, 2013. http://digital.library.ucf.edu/cdm/ref/collection/ETD/id/5917.

Full text
Abstract:
The best-known and most widely used methods use cash flows and tangible assets to measure the impact of investments on an organization's outputs. But in the last decade many newer organizations whose outputs are heavily dependent on information technology utilize knowledge as their main asset. These organizations' market values lie in the knowledge of their employees and their technological capabilities. In the current technology-based business landscape, the value added by assets utilized for the generation of outputs cannot be appropriately measured and managed without considering the role that intangible assets and knowledge play in executing processes. The analysis of processes for comparison and decision making based on intangible value added can be accomplished using the knowledge required to execute processes. The measurement of value added by knowledge can provide a more realistic framework for the analysis of processes where traditional cost methods are not appropriate, enabling managers to better allocate and control knowledge-based processes. Further consideration of interactions and complexity between proposed process alternatives can yield answers about where and when investments can improve value added while dynamically providing higher returns on investment.
Ph.D.
Doctorate
Industrial Engineering and Management Systems
Engineering and Computer Science
Industrial Engineering
APA, Harvard, Vancouver, ISO, and other styles
8

Gustafsson, Adam. "An Analysis of Platform Game Design : Implementation Categories and Complexity Measurements." Thesis, Linnéuniversitetet, Institutionen för datavetenskap (DV), 2014. http://urn.kb.se/resolve?urn=urn:nbn:se:lnu:diva-35517.

Full text
Abstract:
This thesis addresses design and development problems identified within the platform-game genre. The problem described originates from the fluctuating curve of interest towards the platform-game genre that can be observed between the 1980s and today. The problem stated in this thesis is that modern platform-game developers may often overlook and/or deprioritize important design and gameplay related components that we find recurring in previously popular games within the genre. This thesis strives to address such problems by decomposing the development process of a platform game into a light framework titled Implementation categories. All included categories represent a set of design and development related platform-game components, primarily identified through previous research in the field. In order to create an understanding of each category's complexity, as well as account for the possibilities of using the categories as a guideline when developing a platform game, a prototype game was developed. The Implementation categories included in the prototype were then measured with a set of software complexity metrics. This thesis will motivate and explain the selection of implementation categories, account for the usage of software complexity metrics, and present a detailed documentation of the prototype development. The result of this thesis is a thorough presentation of the Implementation categories, accompanied by complexity examples for each category as well as a complete game prototype. The complete results of this thesis will hopefully be of assistance in small-scale, independent or academic game projects in regard to design, decision making, prioritization and time planning.
APA, Harvard, Vancouver, ISO, and other styles
9

Yeow, Pamela. "Individual and organisational change management strategies : a proposed framework drawn from comparative studies in complexity theory and models of stress and well-being." Thesis, University of Sheffield, 2000. http://kar.kent.ac.uk/25817/.

Full text
APA, Harvard, Vancouver, ISO, and other styles
10

Mousavi, Seyedamirhossein. "Maintainability Evaluation of Single Page Application Frameworks : Angular2 vs. React." Thesis, Linnéuniversitetet, Institutionen för datavetenskap (DV), 2017. http://urn.kb.se/resolve?urn=urn:nbn:se:lnu:diva-60901.

Full text
Abstract:
Web applications are subject to intense market forces, fast delivery, and rapid requirement and code change. These factors make maintainability a significant concern in any software development, and especially in web application development. In this report we develop a functionally equivalent prototype of an existing Angular app using ReactJs and afterwards compare their maintainability as defined by ISO/IEC 25010. The maintainability comparison is made by calculating the maintainability index for each of the applications using the Plato analysis tool. The results do not show a significant difference in the calculated values of the final products. Source code analysis shows that changes in data flow need more modification in the Angular app, but with the object-oriented approach provided by Angular, we can have smaller chunks of code and thus higher maintainability per file and, respectively, a better average value. We conclude that, given the lack of research and models in this area, MI is a consistent measurement model and Plato is a suitable tool for analysis. Though maintainability is highly bound to the implementation, the functionalities provided by the Angular framework as a bundle are more appropriate for large enterprises and complex products, whereas React works better for smaller products.
APA, Harvard, Vancouver, ISO, and other styles
11

Rosengren, Anna, Elsayed Mohamed Maher, and Niklas Eklund. "Corporate leadership development programs towards sustainability." Thesis, Blekinge Tekniska Högskola, Institutionen för strategisk hållbar utveckling, 2017. http://urn.kb.se/resolve?urn=urn:nbn:se:bth-14780.

Full text
Abstract:
With the increasing level of complexity that leaders face today, represented in the accelerating pace of technological advancement and globalization, along with climate change indicators reaching unprecedented levels, the need for good leadership quality has become more crucial than ever. The Framework for Strategic Sustainable Development provides a systems perspective, a principle-based definition and a way to strategically move towards sustainability; however, there is still a need to specify what is required for leaders to lead organizations through this process. The aim of the thesis is to explore how corporate leadership development companies can develop the essential leadership competencies to address the sustainability challenge. The study used the Key Competences in Sustainability Framework as a base to interview six leadership development companies from different areas of the world. The findings revealed that there is an essential need for self-development for leaders to handle complexity, as well as a need for leaders to create the proper conditions for their organizations to utilize the competences from the KCSF. Furthermore, the results also showcased the need for a standard, common definition of sustainability.
APA, Harvard, Vancouver, ISO, and other styles
12

Glaudert, Nathalie. "La complexité linguistique : essai de théorisation et d'application dans un cadre comparatiste." Phd thesis, Université de la Réunion, 2011. http://tel.archives-ouvertes.fr/tel-00716874.

Full text
Abstract:
This thesis in theoretical linguistics is set within a comparative framework. The first part of the thesis is an attempt to theorize the measurement of linguistic complexity. In it, we propose a redefinition of markedness theory, the foundation of our cross-cutting research, which takes into account (1) the different definitions it has received over the course of its development, (2) the contributions that other theoretical models can offer, and (3) the criticisms levelled at it up to the present study. The second part of the thesis is an attempt to apply markedness theory, with the aim of testing its degree of validity across several components of language and in intra- and intersystemic analyses of several Indo-European and Indian Ocean languages. It also seeks to identify the theory's limits and to present the functional principles with which it competes.
APA, Harvard, Vancouver, ISO, and other styles
13

Emanuelsson, Kristoffer. "Approximating multi-commodity max-flow in practice." Thesis, KTH, Skolan för datavetenskap och kommunikation (CSC), 2016. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-184193.

Full text
Abstract:
Garg and Könemann developed a framework for computing multi-commodity maximum flow in a graph, later called a multiplicative weight update framework. Madry used this framework and exchanged Dijkstra's algorithm for a dynamic graph algorithm for approximating the shortest paths through the graph. With this approach he developed the fastest algorithm to date for calculating the multi-commodity maximum flow, with a running time of Õ(mn/ε²). This project has implemented the algorithm and compared it with a slightly modified version of the formerly fastest algorithm, by Fleischer, which has a time complexity of Õ(m²/ε²). The results show that Madry's algorithm is slower than Fleischer's algorithm in practice for graphs with fewer than 100 million edges. This project also computed the space needed for the dynamic algorithms used in Madry's algorithm and can show a resulting space complexity of O(n(n+m)log²n), compared to the space complexity of Fleischer's algorithm of O(n). For a graph with 100 million edges, 50 million GB of space is needed to use Madry's algorithm, which is more than our test computers had. We can therefore conclude that Madry's algorithm is not usable in real life today, both in terms of memory usage and time consumption.
APA, Harvard, Vancouver, ISO, and other styles
14

Palix, Nicolas, Julia L. Lawall, Gaël Thomas, and Gilles Muller. "How Often do Experts Make Mistakes?" Universität Potsdam, 2010. http://opus.kobv.de/ubp/volltexte/2010/4132/.

Full text
Abstract:
Large open-source software projects involve developers with a wide variety of backgrounds and expertise. Such software projects furthermore include many internal APIs that developers must understand and use properly. Depending on the intended purpose of these APIs, they are more or less frequently used, and used by developers with more or less expertise. In this paper, we study the impact of usage patterns and developer expertise on the rate of defects occurring in the use of internal APIs. For this preliminary study, we focus on memory management APIs in the Linux kernel, as the use of these has been shown to be highly error-prone in previous work. We study defect rates and developer expertise to consider, e.g., whether widely used APIs are more defect-prone because they are used by less experienced developers, or whether defects in widely used APIs are more likely to be fixed.
APA, Harvard, Vancouver, ISO, and other styles
15

"Charting Caregiver Movement Using a Complexity Science Framework: An Emergent Perspective." Doctoral diss., 2013. http://hdl.handle.net/2286/R.I.18750.

Full text
Abstract:
Health and healing in the United States is in a moment of deep and broad transformation. Underpinning this transformation is a shift in focus from practitioner- and system-centric perspectives to patient and family expectations and their accompanying localized narratives. Situated within this transformation are patients and families of all kinds. This shift's interpretation lies in the converging and diverging trails of biomedicine, a patient-centric perspective of consensus between practitioner and patient, and postmodern philosophy, a break from prevailing norms and systems. Lending context is the dynamic interplay between increasing ethnic/cultural diversity, acculturation/biculturalism, and medical pluralism. Diverse populations continue to navigate multiple health and healing paradigms, engage in the process of their integration, and use health and healing practices that run corollary to them. The way this experience is viewed, whether biomedically or philosophically, has implications for the future of healthcare. Over this fluid interpenetration, with its vivid nuance, loom widespread health disparities. The adverse effects of static, fragmented healthcare systems unable to identify and answer diverse populations' emergent needs are acutely felt by these individuals. Eradication of health disparities is born from insight into how these populations experience health and healing. The resulting strategy must be one that simultaneously addresses the complex intricacies of patient-centered care, permits emergence of more localized narratives, and eschews systems that are no longer effective. It is the movement of caregivers across multiple health and healing sources, managing care for loved ones, that provides this insight and in which this project is keenly interested. Uncovering the emergent patterns of caregivers' management of these sources reveals a rich and nuanced spectrum of realities.
These realities are replete with opportunities to re-frame health and healing in ways that better reflect what these diverse populations of caregivers and care recipients need. Engaging female Mexican American caregivers, a population whose experience is well-suited to aid in this re-frame, this project begins to provide that insight. Informed by a parent framework of Complexity Science, and balanced between biomedical and postmodern perspectives, this constructivist grounded theory secondary analysis charts these caregivers' processes and offers provocative findings and recommendations for understanding their experiences.
Dissertation/Thesis
Ph.D. Healthcare Innovation 2013
APA, Harvard, Vancouver, ISO, and other styles
