Dissertations / Theses on the topic 'Black-box Analysis'

Consult the top 40 dissertations / theses for your research on the topic 'Black-box Analysis.'

1

Arlt, Stephan, and Andreas Podelski (academic supervisor). "Program analysis and black-box GUI testing." Freiburg: Universität, 2014. http://d-nb.info/1123479232/34.

2

Mohammadi, Hossein. "Kriging-based black-box global optimization : analysis and new algorithms." Thesis, Lyon, 2016. http://www.theses.fr/2016LYSEM005/document.

Abstract:
Efficient Global Optimization (EGO) is regarded as the state-of-the-art algorithm for global optimization of costly black-box functions. Nevertheless, the method has some difficulties, such as the ill-conditioning of the GP covariance matrix and slow convergence to the global optimum. The choice of the parameters of the GP is critical, as it controls the functional family of surrogates used by EGO, and the effect of different parameters on the performance of EGO needs further investigation. Finally, it is not clear that the way the GP is learned from data points in EGO is the most appropriate in the context of optimization. This work deals with the analysis and treatment of these different issues. Firstly, this dissertation contributes to a better theoretical and practical understanding of the impact of regularization strategies on GPs, presents a new regularization approach based on distribution-wise GP, and gives practical guidelines for choosing a regularization strategy in GP regression. Secondly, a new optimization algorithm is introduced that combines EGO and CMA-ES, the latter being a global yet convergent search. The new algorithm, called EGO-CMA, uses EGO for early exploration and then CMA-ES for final convergence, and improves the performance of both EGO and CMA-ES. Thirdly, the effect of GP parameters on EGO performance is carefully analyzed, allowing a deeper understanding of the influence of these parameters on the EGO iterates. Finally, a new self-adaptive EGO is presented, introducing a novel approach in which parameters are learned directly from their contribution to the optimization.
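As an illustration of the method family this abstract discusses, here is a minimal EGO-style loop, assuming a scikit-learn GP surrogate, an expected-improvement criterion, and a toy one-dimensional objective; the nugget term `alpha` stands in for the simplest of the regularization strategies mentioned. This is a generic sketch, not the thesis's code.

```python
# Minimal EGO-style loop: GP surrogate + expected improvement (EI).
import numpy as np
from scipy.stats import norm
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import Matern

def objective(x):                          # toy 1-D black box (illustrative)
    return np.sin(3 * x) + 0.5 * x**2

rng = np.random.default_rng(0)
X = rng.uniform(-2, 2, size=(5, 1))        # initial design
y = objective(X).ravel()

for _ in range(20):
    # alpha adds a "nugget" that regularizes the GP covariance matrix,
    # the classical answer to the ill-conditioning mentioned above
    gp = GaussianProcessRegressor(kernel=Matern(nu=2.5), alpha=1e-6,
                                  normalize_y=True).fit(X, y)
    cand = np.linspace(-2, 2, 1000).reshape(-1, 1)
    mu, sd = gp.predict(cand, return_std=True)
    z = (y.min() - mu) / np.maximum(sd, 1e-12)
    ei = (y.min() - mu) * norm.cdf(z) + sd * norm.pdf(z)  # expected improvement
    x_next = cand[np.argmax(ei)].reshape(1, 1)
    X = np.vstack([X, x_next])
    y = np.append(y, objective(x_next).ravel())

print("best x and f(x):", X[y.argmin()].item(), y.min())
```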
3

Atamna, Asma. "Analysis of Randomized Adaptive Algorithms for Black-Box Continuous Constrained Optimization." Thesis, Université Paris-Saclay (ComUE), 2017. http://www.theses.fr/2017SACLS010/document.

Abstract:
We investigate various aspects of adaptive randomized (or stochastic) algorithms for both constrained and unconstrained black-box continuous optimization. The first part of this thesis focuses on step-size adaptation in unconstrained optimization. We first present a methodology for efficiently assessing a step-size adaptation mechanism that consists in testing a given algorithm on a minimal set of functions, each reflecting a particular difficulty that an efficient step-size adaptation algorithm should overcome. We then benchmark two step-size adaptation mechanisms on the well-known BBOB noiseless testbed and compare their performance to that of the state-of-the-art evolution strategy (ES), CMA-ES, with cumulative step-size adaptation. In the second part of this thesis, we investigate linear convergence of a (1 + 1)-ES and a general step-size adaptive randomized algorithm on a linearly constrained optimization problem, where an adaptive augmented Lagrangian approach is used to handle the constraints. To that end, we extend the Markov chain approach used to analyze randomized algorithms for unconstrained optimization to the constrained case. We prove that when the augmented Lagrangian associated to the problem, centered at the optimum and the corresponding Lagrange multipliers, is positive homogeneous of degree 2, then for algorithms enjoying some invariance properties there exists an underlying homogeneous Markov chain whose stability (typically positivity and Harris recurrence) leads to linear convergence to both the optimum and the corresponding Lagrange multipliers. We deduce linear convergence under the aforementioned stability assumptions by applying a law of large numbers for Markov chains. We also present a general framework to design an augmented-Lagrangian-based adaptive randomized algorithm for constrained optimization from an adaptive randomized algorithm for unconstrained optimization.
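For readers unfamiliar with the algorithm family analyzed here, the following toy sketch shows a (1+1)-ES with the classical 1/5th-success-rule step-size adaptation on an unconstrained sphere function; the adaptive augmented-Lagrangian constraint handling studied in the thesis is omitted, and the constants are illustrative.

```python
# (1+1)-ES with 1/5th-success-rule step-size adaptation on the sphere
# function; a toy instance of the family analyzed above.
import numpy as np

def sphere(x):
    return float(x @ x)

rng = np.random.default_rng(1)
dim = 10
x = rng.normal(size=dim)
sigma, fx = 1.0, sphere(x)
for _ in range(2000):
    y = x + sigma * rng.normal(size=dim)   # mutate the single parent
    fy = sphere(y)
    if fy <= fx:                           # success: accept, enlarge step
        x, fx = y, fy
        sigma *= np.exp(0.8)
    else:                                  # failure: shrink step
        sigma *= np.exp(-0.2)              # exp(0.8) * exp(-0.2)**4 = 1,
                                           # so sigma is stationary at a
                                           # 1/5 success rate
print("f(x) =", fx, " final sigma =", sigma)
```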
4

ROSSIGNOLI, DOMENICO. "DEMOCRACY, INSTITUTIONS AND GROWTH: EXPLORING THE BLACK BOX." Doctoral thesis, Università Cattolica del Sacro Cuore, 2013. http://hdl.handle.net/10280/1870.

Abstract:
The economics and political science literatures show a wide consensus about the positive effect of property rights, contract-enforcing arrangements and, more generally, economic institutions on long-run growth. Conversely, the linkage between democracy and growth remains unclear and is not conclusively supported by empirical research. This work is an attempt to reconcile the stylized facts about democracy and growth, which evidence a long-run "synergic success" between the two, with the theoretical and empirical literature. After thoroughly surveying the relevant literature on the topic, this study claims that the effect of democracy on long-run growth is indirect, channeled by means of institutions. To test this hypothesis, the thesis provides an original analytical framework which is applied to a panel of 194 countries over the period 1961-2010, adopting a System-GMM estimation technique and a wide range of robustness controls. The results suggest that democracy is positively related to "better" (namely more growth-enhancing) institutions, especially with respect to economic institutions and the rule of law. Hence, the findings suggest that the overall effect on growth is positive, indirect and channeled by institutions. However, since the results are not completely conclusive, further investigation is suggested into additional determinants of democracy that potentially affect its pro-growth effect.
5

Karcher, Cody Jacob. "A heuristic for including black box analysis tools into a geometric programming formulation." Thesis, Massachusetts Institute of Technology, 2017. http://hdl.handle.net/1721.1/112465.

Abstract:
Recently, geometric programming has been proposed as a powerful tool for enhancing aircraft conceptual design. While geometric programming has shown promise in early studies, current formulations preclude the designer from using the black box analysis codes that are prolific in the aircraft design community. Previous work has shown the ability to fit data from these black box codes prior to the optimization run; however, this is often a time-consuming and computationally expensive process that does not scale well to higher-dimensional black boxes. Based upon existing iterative optimization methods, we propose a heuristic for including black box analysis codes in a geometric programming framework by utilizing sequential geometric programming (SGP). We demonstrate a heuristic SGP method and apply it to a solar powered aircraft using a black-boxed, GP-compatible profile drag function. Using this heuristic algorithm, we achieve less than a 1% difference in the objective function between a direct implementation of the constraint and a black box implementation of the constraint.
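The core step of such an SGP heuristic is replacing the black box with a locally fitted, GP-compatible monomial, which is linear in log space and can therefore be fit by least squares. A minimal sketch under assumed names, with a made-up drag-like black box; the thesis's actual formulation may differ, and the fitted monomial would then be handed to a GP solver as a compatible constraint.

```python
# Local monomial approximation of a black-box function: in log space a
# monomial c * x1**a1 * x2**a2 is linear, so it can be fit by least
# squares around the current iterate.
import numpy as np

def black_box_drag(x):                 # stand-in for an external analysis code
    return 0.5 * x[0]**-1.2 * x[1]**0.7

def fit_monomial(f, x0, rel_step=0.05, n=20, seed=0):
    rng = np.random.default_rng(seed)
    logx0 = np.log(x0)
    # sample log-uniformly in a small trust region around x0
    L = logx0 + rng.uniform(-rel_step, rel_step, size=(n, len(x0)))
    logy = np.array([np.log(f(np.exp(row))) for row in L])
    A = np.hstack([np.ones((n, 1)), L])          # columns: [log c, a1, a2]
    coef, *_ = np.linalg.lstsq(A, logy, rcond=None)
    return np.exp(coef[0]), coef[1:]             # f(x) ~= c * prod(x**a)

c, a = fit_monomial(black_box_drag, x0=np.array([2.0, 3.0]))
print(f"monomial fit: c={c:.4f}, exponents={a}")  # recovers ~0.5, [-1.2, 0.7]
```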
6

Eskandari, Aram, and Benjamin Tellström. "Analysis of the Performance Impact of Black-box Randomization for 7 Sorting Algorithms." Thesis, KTH, Skolan för teknikvetenskap (SCI), 2018. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-231089.

Abstract:
Can black-box randomization change the performance of algorithms? The problem of worst-case behaviour in algorithms is difficult to handle, and black-box randomization is one method that has not yet been rigorously tested. If it could be used to mitigate worst-case behaviour for our chosen algorithms, black-box randomization should be seriously considered for active usage in more algorithms. We have found variables that can be put through a black-box randomizer while our algorithm still gives correct output. These variables have been disturbed, and a qualitative manual analysis has been done to observe the performance impact during black-box randomization. This analysis was done for 7 different sorting algorithms using Java OpenJDK 8. Our results show signs of improvement after black-box randomization; however, our experiments showed a clear uncertainty when conducting time measurements for sorting algorithms.
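A concrete instance of the idea, as a hedged sketch: shuffling a sort's input is a semantics-preserving black-box randomization (the output is sorted either way), and it defuses the classic worst case of a first-pivot quicksort on already-sorted data. Python is used here for brevity; the thesis worked in Java.

```python
# Shuffling the input before sorting leaves the result unchanged but
# breaks up adversarial input orderings.
import random, sys, time

def quicksort(a):                       # naive first-element-pivot quicksort
    if len(a) <= 1:
        return a
    p, rest = a[0], a[1:]
    return (quicksort([x for x in rest if x < p]) + [p] +
            quicksort([x for x in rest if x >= p]))

def randomized(a):
    b = list(a)
    random.shuffle(b)                   # the randomizing "black box"
    return quicksort(b)

sys.setrecursionlimit(10_000)           # plain quicksort recurses n deep here
worst = list(range(900))                # sorted input: the worst case
for name, f in [("plain", quicksort), ("randomized", randomized)]:
    t0 = time.perf_counter()
    assert f(worst) == worst            # output identical in both cases
    print(name, round(time.perf_counter() - t0, 4), "s")
```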
7

Brown, Morgan Lorene. "Of Mice and Men: The Development and Analysis of a Black Box Production." Kent State University Honors College / OhioLINK, 2019. http://rave.ohiolink.edu/etdc/view?acc_num=ksuhonors1575839244848729.

8

Menguy, Grégoire. "Black-box code analysis for reverse engineering through constraint acquisition and program synthesis." Electronic Thesis or Diss., université Paris-Saclay, 2023. http://www.theses.fr/2023UPASG023.

Abstract:
Software keeps becoming larger and more complex, making crucial tasks like code testing, verification, or code understanding highly difficult for humans, hence the need for methods to reason about code automatically. These are usually white-box, and use the code syntax to deduce its properties. While they have proven very powerful, they also show limitations: they need the source code, their efficiency degrades with code size and the complexity of data structures, and they are highly impacted by syntactic code complexity, which optimizations and obfuscations amplify. This thesis explores how black-box code analysis can infer valuable properties for reverse engineering through data-driven learning. First, we consider the function contract inference problem, which aims to infer over which inputs a code function can be executed to obtain only good behaviors. We extend the constraint acquisition learning framework, notably solving one of its major flaws: the dependency on a human user. This leads to PreCA, the first black-box approach enjoying clear theoretical guarantees, which makes PreCA especially suitable for development uses. Second, we consider the deobfuscation problem, which aims to simplify obfuscated code. Our proposal, Xyntia, synthesizes code block semantics through S-metaheuristics to offer an understandable version of the code. Xyntia significantly improves the state of the art in terms of robustness and speed. In addition, we propose the first two protections effective against black-box deobfuscation.
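To make the contract-inference setting concrete, here is a toy black-box precondition learner: it samples inputs, observes which runs fail, and keeps the candidate atomic constraints consistent with the observations. The target function and candidate set are invented; this illustrates the problem statement only, not PreCA's actual algorithm.

```python
# Toy black-box precondition inference: separate good runs from bad runs
# using a small hypothesis space of atomic constraints.
import random

def target(x, y):                 # black box under analysis (illustrative)
    return (x - 1) / y            # implicitly requires y != 0

candidates = {                    # hypothesis space of atomic preconditions
    "y != 0": lambda x, y: y != 0,
    "x != 0": lambda x, y: x != 0,
    "x > y":  lambda x, y: x > y,
}

random.seed(0)
good, bad = [], []
for _ in range(500):
    x, y = random.randint(-5, 5), random.randint(-5, 5)
    try:
        target(x, y)
        good.append((x, y))
    except Exception:
        bad.append((x, y))

# a candidate survives if it holds on every good run and rules out at
# least one observed bad run
inferred = [name for name, pred in candidates.items()
            if all(pred(*g) for g in good) and any(not pred(*b) for b in bad)]
print("inferred precondition:", " and ".join(inferred))   # -> y != 0
```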
9

Jankovic, Anja. "Towards Online Landscape-Aware Algorithm Selection in Numerical Black-Box Optimization." Electronic Thesis or Diss., Sorbonne université, 2021. http://www.theses.fr/2021SORUS302.

Abstract:
Black-box optimization algorithms (BBOAs) are conceived for settings in which exact problem formulations are non-existent, inaccessible, or too complex for an analytical solution. BBOAs are essentially the only means of finding a good solution to such problems. Due to their general applicability, BBOAs can exhibit different behaviors when optimizing different types of problems. This yields a meta-optimization problem of choosing the best suited algorithm for a particular problem, called the algorithm selection (AS) problem. By reason of inherent human bias and limited expert knowledge, the vision of automating the selection process has quickly gained traction in the community. One prominent way of doing so is via so-called landscape-aware AS, where the choice of the algorithm is based on predicting its performance by means of numerical problem instance representations called features. A key challenge that landscape-aware AS faces is the computational overhead of extracting the features, a step typically designed to precede the actual optimization. In this thesis, we propose a novel trajectory-based landscape-aware AS approach which incorporates the feature extraction step within the optimization process. We show that features computed using the search trajectory samples lead to robust and reliable predictions of algorithm performance, and to powerful algorithm selection models built atop them. We also present several preparatory analyses, including a novel combination of two complementary regression strategies that outperforms classical single-regression models and amplifies the quality of the final selector.
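A schematic sketch of trajectory-based algorithm selection as just described: features are computed from points the search has already sampled (so no extra evaluations), one regressor per algorithm predicts performance, and the algorithm with the best prediction is chosen. The training data, feature definitions, and algorithm names below are invented; real systems would use established landscape features and benchmark performance data.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)

def trajectory_features(xs, ys):
    """Cheap landscape features from trajectory samples (illustrative)."""
    dist_to_best = np.linalg.norm(xs - xs[ys.argmin()], axis=1)
    return np.array([ys.mean(), ys.std(),
                     np.corrcoef(dist_to_best, ys)[0, 1]])  # fitness-distance

# pretend we logged 200 trajectories with the known downstream
# performance of two algorithms on each problem instance (synthetic)
F = rng.normal(size=(200, 3))
perf = {"CMA-ES":        F @ [0.5, -1.0, 2.0] + rng.normal(0, .1, 200),
        "random-search": F @ [0.1, 0.3, -0.5] + rng.normal(0, .1, 200)}
models = {a: RandomForestRegressor(random_state=0).fit(F, p)
          for a, p in perf.items()}

# at selection time: features come from the current run's own samples
xs = rng.uniform(-5, 5, size=(30, 4))
ys = np.sum(xs**2, axis=1)                     # toy objective values
f = trajectory_features(xs, ys).reshape(1, -1)
choice = min(models, key=lambda a: models[a].predict(f)[0])
print("selected algorithm:", choice)
```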
10

Saeed, Umar, and Ansur Mahmood Amjad. "ISTQB : Black Box testing Strategies used in Financial Industry for Functional testing." Thesis, Blekinge Tekniska Högskola, Sektionen för ingenjörsvetenskap, 2009. http://urn.kb.se/resolve?urn=urn:nbn:se:bth-3237.

Abstract:
Black box testing techniques are important for testing the functionality of a system without knowing its inner details, helping to ensure correct, consistent, complete and accurate system behaviour. Black box testing strategies are used to test logical, data or behavioral dependencies and to generate test data and high-quality test cases with the potential to expose more defects. They play a pivotal role in detecting possible defects in a system and can help in the successful completion of a system according to its required functionality. This thesis presents studies of five companies regarding important black box testing strategies. It explores the black box testing techniques that are present in the literature and practiced in industry. Interview studies were conducted at Pakistani companies providing solutions to the finance industry, in an attempt to determine the usage of these techniques. The advantages and disadvantages of the identified black box testing strategies are discussed, along with a comparison of the different techniques with respect to defect-guessing potential, dependencies, sophistication, effort, and cost.
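Two staple techniques that such studies typically cover, equivalence partitioning and boundary value analysis, can be sketched in a few lines. The transfer-limit rule below is an assumed example specification, not one taken from the thesis.

```python
# Equivalence partitioning + boundary value analysis for an assumed
# rule: a transfer amount is valid iff 1 <= amount <= 10000.
def accepts_transfer(amount):
    return 1 <= amount <= 10_000

# one representative per equivalence class: below, inside, above
partitions = {"below": -50, "valid": 500, "above": 99_999}
# boundary values: each edge of the valid range and its neighbours
boundaries = [0, 1, 2, 9_999, 10_000, 10_001]

for name, v in partitions.items():
    print(f"partition {name:>5}: amount={v:>6} -> {accepts_transfer(v)}")
for v in boundaries:
    expected = 1 <= v <= 10_000          # oracle derived from the spec
    assert accepts_transfer(v) == expected, f"defect at boundary {v}"
print("all boundary checks passed")
```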
11

Dalley, Jeffrey Brian. "The Seesaw of Organisational Social Capital Flows: Inside the "Black Box" of Social Exchange." Thesis, University of Canterbury. Management, 2011. http://hdl.handle.net/10092/6001.

Abstract:
The purpose of this study is to develop a deeper understanding of the informal contributions of employees to organisational success; more specifically, the exchange 'mechanism' by which resources accrue to organisations through the social relationships of their members. The second purpose is to explore the influence of organisational contextual factors on this exchange mechanism; more specifically, the influence, if any, of contingent employment practices. Through the use of a qualitative research design, I have gained an in-depth understanding of the cognitive mechanism employed by organisational actors to arrive at a decision on whether or not to initiate social exchange, in order to facilitate the flow of organisational social capital. Data was analysed using the Dimensional Analysis method. This analysis draws on the theoretical perspectives of interpretivism and symbolic interactionism, both of which are underpinned by a social construction epistemology. This provides the necessary link for understanding the connections between macro- and micro-level social action of social exchange in organisational settings. My findings identify a complex cognitive process employed by actors for the purpose of reaching a decision with respect to initiating social exchange in organisational settings. This process is termed Social Exchange Transaction Analysis. It is undertaken at the individual level and ultimately controls the flow of organisational social capital through a social network to the organisation. This complexity is a reflection of both the many dimensions of the phenomenon and the interconnectedness and interactions between them. Social Exchange Transaction Analysis builds an 'analytical' picture of the potential social exchange transaction, to enable the organisational actor to arrive at a decision on whether or not to initiate social exchange, and thereby facilitate the flow of organisational social capital.
12

Fruth, Jana. "Sensitivy analysis and graph-based methods for black-box functions with on application to sheet metal forming." Thesis, Saint-Etienne, EMSE, 2015. http://www.theses.fr/2015EMSE0779/document.

Abstract:
The general field of the thesis is the sensitivity analysis of black-box functions. Sensitivity analysis studies how the variation of the output can be apportioned to the variation of input sources. It is an important tool in the construction, analysis, and optimization of computer experiments. The total interaction index is presented, which can be used for the screening of interactions. Several variance-based estimation methods are suggested. Their properties are analyzed theoretically as well as on simulations. A further chapter concerns the sensitivity analysis for models that can take functions as input variables and return a scalar value as output. A very economical sequential approach is presented, which not only discovers the sensitivity of those functional variables as a whole but identifies relevant regions in the functional domain. As a third concept, support index functions, functions of sensitivity indices over the input distribution support, are suggested. Finally, all three methods are successfully applied in the sensitivity analysis of sheet metal forming models.
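For context, the variance-based estimators this line of work builds on can be sketched with a standard pick-freeze Monte Carlo scheme. The snippet estimates first-order and total Sobol' indices on the classical Ishigami test function; the total interaction index itself is a further extension not shown here.

```python
# Pick-freeze Monte Carlo estimation of variance-based sensitivity
# indices (Saltelli first-order and Jansen total-effect estimators).
import numpy as np

def ishigami(X, a=7.0, b=0.1):          # standard SA test function
    return (np.sin(X[:, 0]) + a * np.sin(X[:, 1])**2
            + b * X[:, 2]**4 * np.sin(X[:, 0]))

rng = np.random.default_rng(0)
N, d = 100_000, 3
A = rng.uniform(-np.pi, np.pi, (N, d))
B = rng.uniform(-np.pi, np.pi, (N, d))
fA, fB = ishigami(A), ishigami(B)
var = np.var(np.concatenate([fA, fB]))

for i in range(d):
    ABi = A.copy()
    ABi[:, i] = B[:, i]                        # freeze all inputs but x_i
    fABi = ishigami(ABi)
    S_i = np.mean(fB * (fABi - fA)) / var      # first-order index
    T_i = 0.5 * np.mean((fA - fABi)**2) / var  # total-effect index
    print(f"x{i+1}: S = {S_i:.3f}   T = {T_i:.3f}")
```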
13

Khalil, Rana Fouad. "Why Johnny Still Can’t Pentest: A Comparative Analysis of Open-source Black-box Web Vulnerability Scanners." Thesis, Université d'Ottawa / University of Ottawa, 2018. http://hdl.handle.net/10393/38595.

Abstract:
Black-box web application vulnerability scanners are automated tools that crawl a web application to look for vulnerabilities. These tools are often used in one of two ways. In the first approach, scanners are used as point-and-shoot tools, where a scanner is only given the root URL of an application and asked to scan the site. In the second approach, scanners are first configured to maximize the crawling coverage and vulnerability detection accuracy. Although the performance of leading commercial scanners has been thoroughly studied, very little research has been done to evaluate open-source scanners. This paper presents a feature and performance evaluation of five open-source scanners. We analyze the crawling coverage, vulnerability detection accuracy, scanning speed, reporting and usability features. The scanners are tested against two well-known benchmarks: WIVET and WAVSEP. Additionally, the scanners are tested against a realistic web application called WackoPicko. The chosen benchmarks are composed of a wide range of vulnerabilities and crawling challenges. Each scanner is tested in two modes: default and configured. Lastly, the scanners are compared with the state-of-the-art commercial scanner Burp Suite Professional. Our results show that being able to properly crawl a web application is a critical task in detecting vulnerabilities. Unfortunately, the majority of the scanners evaluated had difficulty crawling through common web technologies such as dynamically generated JavaScript content and Flash applications. We also identified several classes of vulnerabilities that are not being detected by the scanners. Furthermore, our results show that scanners displayed considerable improvement when run in configured mode.
14

Hanson, Ardis. "Unlocking the Black Box of Policymaking: A Discursive View of the Florida Commission on Mental Health and Substance Abuse." Scholar Commons, 2012. http://scholarcommons.usf.edu/etd/4327.

Abstract:
Discourse creates the world of policy. Discourse plays a key role within policy formation; political discourse is made visible within particular discursive (spoken and written) practices. Hence, mental health policy is the endpoint of a discursive process and is, in itself, an institutional process. The shared understanding necessary to formulate policy is crucial to persons who are responsible for policy decisions and recommendations. Since the public perception is that public policy problems are too complicated for ordinary people to deal with, the policy problem is reframed into manageable "bits." It is how these "bits" are framed, named, and made sense of that concerns me most in the policymaking process. The purpose of this dissertation is to make visible the often invisible processes that occur in the creation of the commission's final report. To do so, I use a discursive approach and a selection of discourse tokens, both talk and text, to examine the workings of the Florida Commission on Mental Health and Substance Abuse.
15

Pina, Katia Oliveira. "Into the black box of Knowledge Intensive Business Services : understanding the knowledge bases, innovation and competitiveness of KIBS." Thesis, University of Manchester, 2016. https://www.research.manchester.ac.uk/portal/en/theses/into-the-black-box-of-knowledge-intensive-business-services-understanding-the-knowledge-bases-innovation-and-competitiveness-of-kibs(6e9139fc-5a82-4378-88bd-5712d2aeef5b).html.

Abstract:
This dissertation focuses on Knowledge Intensive Business Services (KIBS). It aims to understand what these businesses are and to examine variety among them. In seeking to understand their diversity, I focus especially on the 'knowledge bases' at the core of their activities. The dissertation is based on three complementary studies. The first is a systematic review of the literature on KIBS, based primarily on a review of 130 carefully selected, relevant articles and focused on three questions: how are KIBS defined, how do KIBS compete, and how do KIBS innovate? The review shows that (i) the literature is fragmented, with most research not building substantially on previous methods or findings, and (ii) while evidently heterogeneous, most of the literature has overlooked variety among KIBS. I also highlight what still needs to be known about KIBS. The second and third papers then focus on variety among KIBS, by classifying them according to their 'knowledge bases'. In the first of these papers, I classify KIBS according to their primary knowledge bases, following the SAS Model, which identifies three: 'analytical knowledge', 'synthetic knowledge' and 'symbolic knowledge'. Firms in three KIBS sectors ('architecture and engineering consultancy', 'specialist design', and 'computer and IT services') are classified by their primary knowledge base according to information drawn from company websites. I then relate this classification to firm behaviour with respect to innovation, finding differences by primary knowledge base in the nature of the investments firms make to innovate, and in their propensities to innovate. In the second of the papers relating 'knowledge bases' to KIBS, I develop the 'knowledge bases' approach conceptually, methodologically and empirically. Conceptually, I identify a hitherto unrecognised knowledge base: 'compliance knowledge'. This relates to knowledge of, and interpretations of, laws and regulations, and does not fit any of the existing SAS types. Methodologically, I extract fuller information from company websites and develop more sophisticated approaches to measurement, which allow multiple knowledge bases to be present in any one firm. Empirically, I successfully identify 'compliance knowledge', alongside 'analytical' and 'symbolic knowledge'. I show that these are unevenly distributed across KIBS industries, including 'advertising and design', 'architecture', 'engineering consultancy' and 'market research', but importantly there is no one-to-one mapping between knowledge bases and industries. I discuss the implications of this, including for understanding the diversification of KIBS. This dissertation therefore contributes conceptually, methodologically and empirically both to understanding variety among KIBS and to the 'knowledge bases' literature.
16

Deng, Nan. "Systems Support for Carbon-Aware Cloud Applications." The Ohio State University, 2015. http://rave.ohiolink.edu/etdc/view?acc_num=osu1439855103.

17

Campano, Erik. "Artificially Intelligent Black Boxes in Emergency Medicine : An Ethical Analysis." Thesis, Umeå universitet, Institutionen för psykologi, 2019. http://urn.kb.se/resolve?urn=urn:nbn:se:umu:diva-160696.

Abstract:
Artificially intelligent black boxes are increasingly being proposed for emergency medicine settings; this paper uses ethical analysis to develop seven practical guidelines for emergency medicine black box creation and use. The analysis is built around seven variations of a thought experiment involving a doctor, a black box, and a patient presenting chest pain in an emergency department. Foundational concepts, including artificial intelligence, black boxes, transparency methods, emergency medicine, and ethical analysis, are expanded upon. Three major areas of ethical concern are identified, namely consent; culture, agency, and privacy; and fault. These areas give rise to the seven variations. For each, a key ethical question it illustrates is identified and analyzed. A practical guideline is then stated, and its ethical acceptability tested using consequentialist and deontological approaches. The applicability of the guidelines to medicine more generally, and the urgency of continued ethical analysis of black box artificial intelligence in emergency medicine, are clarified.
18

Vamja, Harsh. "Reverse Engineering of Finite State Machines from Sequential Circuits." University of Cincinnati / OhioLINK, 2018. http://rave.ohiolink.edu/etdc/view?acc_num=ucin1530267556456191.

19

Harston, Walter Andrew. "Facies and Reservoir Characterization of the Permian White Rim Sandstone, Black Box Dolomite, and Black Dragon Member of the Triassic Moenkopi Formation for CO2 Storage and Sequestration at Woodside Field, East-Central Utah." BYU ScholarsArchive, 2013. https://scholarsarchive.byu.edu/etd/3567.

Abstract:
Geologic sequestration of anthropogenic carbon dioxide (CO2) greenhouse gas emissions is an engineering solution that potentially reduces CO2 emissions released into the atmosphere thereby limiting their effect on climate change. This study focuses on Woodside field as a potential storage and sequestration site for CO2 emissions. The Woodside field is positioned on a doubly plunging, asymmetrical anticline on the northeast flank of the San Rafael Swell. Particular focus will be placed on the Permian White Rim Sandstone, Black Box Dolomite and Black Dragon Member of the Triassic Moenkopi Formation as the reservoir/seal system to store and sequester CO2 at Woodside field. The White Rim Sandstone, the primary target reservoir, is divided into three stratigraphic intervals based on facies analysis: a lower sand sheet facies (about 60 ft or 18 m), a thick middle eolian sandstone facies (about 390 ft or 119 m), and an upper marine reworked facies (about 70 ft or 21 m). Porosity and permeability analyses from the outcrop indicate good reservoir quality in the eolian sandstone and reworked facies. Porosity in the White Rim Sandstone ranges from 7.6 to 24.1% and permeability reaches up to 2.1 D. The maximum combined thickness of the three facies is 525 ft (160 m) at Woodside field providing a significant volume of porous and permeable rock in which to store CO2. The Black Box Dolomite is the secondary potential reservoir for CO2 storage at Woodside field and has a gross thickness up to 76 ft (23 m). The Black Box Dolomite is divided into four lithofacies: a basal nodular dolomudstone (8.2 -15 ft or 3.5-4.5 m), a dolowackestone (25-37 ft or 7.5-11 m), a dolomitic sandstone (0-8.2 ft or 0-2.5 m), and an upper sandy dolowackestone (0-16 ft or 0-4.9 m). Porosity and permeability analyses indicate reservoir potential in the dolowackestone, dolomitic sandstone, and sandy dolowackestone lithofacies. Porosity in the Black Box Dolomite ranges from 6.6 to 29.2% and permeability reaches up to 358 mD. The nodular dolomudstone lithofacies has relatively poor reservoir quality with porosity up to 9.4% and permeability up to 0.182 mD. This lithofacies could act as a baffle or barrier to fluid communication between the White Rim Sandstone and Black Box Dolomite. The Black Dragon Member of the Triassic Moenkopi Formation will serve as the seal rock for the relatively buoyant CO2 stored in the underlying formations. The Black Dragon Member is comprised of four lithofacies: a chert pebble conglomerate; an interbedded sandstone, siltstone, and shale; a trough cross-stratified sandstone, and an oolitic and algal limestone. The Black Dragon Member has a maximum thickness of 280 ft (85 m) at Woodside field. Mudstone beds contain from 0.16 to 0.47% porosity. QEMSCAN analysis indicates several minerals within shale beds that may react with a CO2-rich brine including calcite (18.73 to 23.43%), dolomite (7.56 to 7.89%), alkali feldspar (4.12 to 4.43 %), glauconite (0.04 to 0.05%), and plagioclase (0.03 to 0.04%). Silty mudstones comprise 75% of this member at Black Dragon Canyon. Volumetric estimates for Woodside field were calculated based on the 10th, 50th, and 90th percent probabilities (P10, P50, and P90). The White Rim Sandstone is the primary target reservoir and has capacity to hold 2.2, 8.8, or 23.7 million metric tonnes (P10, P50, and P90 respectively) of CO2 within the structural closure of Woodside field. 
The Black Box Dolomite may hold 0.5, 1.8, or 4.5 million metric tonnes (P10, P50, and P90, respectively) of additional CO2 within the structural closure of Woodside field. These two formations combined have the capacity to store up to 28.3 million metric tonnes (P90) of CO2.
20

Fruth, Jana, Sonja Kuhnt (academic supervisor), Joachim Kunert (reviewer), and Clémentine Prieur (reviewer). "New methods for the sensitivity analysis of black-box functions with an application to sheet metal forming." Dortmund: Universitätsbibliothek Dortmund, 2015. http://d-nb.info/1101605332/34.

21

Truong, Nghi Khue Dinh. "A web-based programming environment for novice programmers." Thesis, Queensland University of Technology, 2007. https://eprints.qut.edu.au/16471/1/Nghi_Truong_Thesis.pdf.

Abstract:
Learning to program is acknowledged to be difficult; programming is a complex intellectual activity and cannot be learnt without practice. Research has shown that first year IT students presently struggle with setting up compilers, learning how to use a programming editor and understanding abstract programming concepts. Large introductory class sizes pose a great challenge for instructors in providing timely, individualised feedback and guidance for students when they do their practice. This research investigates the problems and identifies solutions. An interactive and constructive web-based programming environment is designed to help beginning students learn to program in high-level, object-oriented programming languages such as Java and C#. The environment eliminates common starting hurdles for novice programmers and gives them the opportunity to successfully produce working programs at the earliest stage of their study. The environment allows students to undertake programming exercises anytime, anywhere, by "filling in the gaps" of a partial computer program presented in a web page, and enables them to receive guidance in getting their programs to compile and run. Feedback on quality and correctness is provided through a program analysis framework. Students learn by doing, receiving feedback and reflecting - all through the web. A key novel aspect of the environment is its capability in supporting small "fill in the gap" programming exercises. This type of exercise places a stronger emphasis on developing students' reading and code comprehension skills than the traditional approach of writing a complete program from scratch. It allows students to concentrate on critical dimensions of the problem to be solved and reduces the complexity of writing programs.
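A minimal sketch of the "fill in the gap" mechanism described above, assuming a server-side template, a student-supplied expression, and test-based feedback; the template, tests, and helper names are invented, and a real system would sandbox the execution rather than call exec directly.

```python
# A gap exercise: the student supplies only the missing line; the server
# splices it into the template, runs it, and returns test-based feedback.
TEMPLATE = """
def total(prices):
    s = 0
    for p in prices:
        {gap}
    return s
"""

def check(student_gap):
    namespace = {}
    exec(TEMPLATE.format(gap=student_gap), namespace)   # unsafe: sandbox me
    total = namespace["total"]
    for args, expected in [([1, 2, 3], 6), ([], 0), ([5], 5)]:
        got = total(args)
        if got != expected:
            return f"failed: total({args}) returned {got}, expected {expected}"
    return "all tests passed"

print(check("s = s + p"))     # a correct attempt
print(check("s = p"))         # a typical mistake gets concrete feedback
```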
22

VUOTTO, SIMONE. "Formal Requirements Analysis and Specification-Based Testing in Cyber-Physical Systems." Doctoral thesis, Università degli studi di Genova, 2021. http://hdl.handle.net/11567/1035085.

Abstract:
Formal requirements analysis plays an important role in the design of safety- and security-critical complex systems such as Cyber-Physical Systems (CPS). It can help in detecting problems early in the system development life-cycle, reducing time and cost to completion. Moreover, its results can be employed at the end of the process to validate the implemented system, guiding the testing phase. Despite its importance, requirements analysis is still largely carried out manually due to the intrinsic difficulty of dealing with natural language requirements, the most common way to represent them. However, manual reviews are time-consuming and error-prone, reducing the potential benefit of the requirements engineering process. Automation can be achieved with the employment of formal methods, but their application is still limited by their complexity and a lack of specialized tools. In this work we focus on the analysis of requirements for the design of CPSs, and on how to automate some activities related to such analysis. We first study how to formalize requirements expressed in a structured English language, encode them in linear temporal logic, check their consistency with off-the-shelf model checkers, and find minimal sets of conflicting requirements in case of inconsistency. We then present a new methodology to automatically generate tests from requirements and execute them on a given system, without requiring knowledge of its internal structure. Finally, we provide a set of tools that implement the studied algorithms and offer easy-to-use interfaces to ease their adoption.
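As a toy illustration of requirement-consistency checking, the sketch below encodes three informal requirements as predicates over bounded traces of boolean signals and searches for a witness trace, falling back to minimal conflicting subsets when none exists. The signals, requirements, and bound are invented; production tools would use LTL and a model checker, as the abstract describes.

```python
# Bounded consistency check over finite traces of boolean signals.
from itertools import product, combinations

ATOMS, K = ("alarm", "shutdown"), 4          # signals and trace-length bound

requirements = {
    "alarm eventually holds":  lambda tr: any(s["alarm"] for s in tr),
    "alarm never holds":       lambda tr: all(not s["alarm"] for s in tr),
    "alarm triggers shutdown": lambda tr: all(
        not s["alarm"] or any(t["shutdown"] for t in tr[i:])
        for i, s in enumerate(tr)),
}

def witness(reqs):
    """Return a trace satisfying all reqs, or None (bounded search)."""
    states = [dict(zip(ATOMS, bits))
              for bits in product([False, True], repeat=len(ATOMS))]
    for trace in product(states, repeat=K):
        if all(req(trace) for req in reqs):
            return trace
    return None

if witness(requirements.values()) is None:
    for r in range(2, len(requirements) + 1):   # smallest subsets first
        conflicts = [c for c in combinations(requirements, r)
                     if witness([requirements[n] for n in c]) is None]
        if conflicts:
            print("minimal conflicting requirement sets:", conflicts)
            break
else:
    print("requirements are consistent")
```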
23

Treloar, Graham John. "A Comprehensive Embodied Energy Analysis Framework." Deakin University, School of Architecture and Building, 1998. http://tux.lib.deakin.edu.au./adt-VDU/public/adt-VDU20041209.161722.

Abstract:
The assessment of the direct and indirect requirements for energy is known as embodied energy analysis. For buildings, the direct energy includes that used primarily on site, while the indirect energy includes primarily the energy required for the manufacture of building materials. This thesis is concerned with the completeness and reliability of embodied energy analysis methods. Previous methods tend to address either one of these issues, but not both at the same time. Industry-based methods are incomplete. National statistical methods, while comprehensive, are a ‘black box’ and are subject to errors. A new hybrid embodied energy analysis method is derived to optimise the benefits of previous methods while minimising their flaws. In industry-based studies, known as ‘process analyses’, the energy embodied in a product is traced laboriously upstream by examining the inputs to each preceding process towards raw materials. Process analyses can be significantly incomplete, due to increasing complexity. The other major embodied energy analysis method, ‘input-output analysis’, comprises the use of national statistics. While the input-output framework is comprehensive, many inherent assumptions make the results unreliable. Hybrid analysis methods involve the combination of the two major embodied energy analysis methods discussed above, either based on process analysis or input-output analysis. The intention in both hybrid analysis methods is to reduce errors associated with the two major methods on which they are based. However, the problems inherent to each of the original methods tend to remain, to some degree, in the associated hybrid versions. Process-based hybrid analyses tend to be incomplete, due to the exclusions associated with the process analysis framework. However, input-output-based hybrid analyses tend to be unreliable because the substitution of process analysis data into the input-output framework causes unwanted indirect effects. A key deficiency in previous input-output-based hybrid analysis methods is that the input-output model is a ‘black box’, since important flows of goods and services with respect to the embodied energy of a sector cannot be readily identified. A new input-output-based hybrid analysis method was therefore developed, requiring the decomposition of the input-output model into mutually exclusive components (ie, ‘direct energy paths’). A direct energy path represents a discrete energy requirement, possibly occurring one or more transactions upstream from the process under consideration. For example, the energy required directly to manufacture the steel used in the construction of a building would represent a direct energy path of one non-energy transaction in length. A direct energy path comprises a ‘product quantity’ (for example, the total tonnes of cement used) and a ‘direct energy intensity’ (for example, the energy required directly for cement manufacture, per tonne). The input-output model was decomposed into direct energy paths for the ‘residential building construction’ sector. It was shown that 592 direct energy paths were required to describe 90% of the overall total energy intensity for ‘residential building construction’. By extracting direct energy paths using yet smaller threshold values, they were shown to be mutually exclusive. Consequently, the modification of direct energy paths using process analysis data does not cause unwanted indirect effects. 
A non-standard individual residential building was then selected to demonstrate the benefits of the new input-output-based hybrid analysis method in cases where the products of a sector may not be similar. Particular direct energy paths were modified with case specific process analysis data. Product quantities and direct energy intensities were derived and used to modify some of the direct energy paths. The intention of this demonstration was to determine whether 90% of the total embodied energy calculated for the building could comprise the process analysis data normally collected for the building. However, it was found that only 51% of the total comprised normally collected process analysis. The integration of process analysis data with 90% of the direct energy paths by value was unsuccessful because: • typically only one of the direct energy path components was modified using process analysis data (ie, either the product quantity or the direct energy intensity); • of the complexity of the paths derived for ‘residential building construction’; and • of the lack of reliable and consistent process analysis data from industry, for both product quantities and direct energy intensities. While the input-output model used was the best available for Australia, many errors were likely to be carried through to the direct energy paths for ‘residential building construction’. Consequently, both the value and relative importance of the direct energy paths for ‘residential building construction’ were generally found to be a poor model for the demonstration building. This was expected. Nevertheless, in the absence of better data from industry, the input-output data is likely to remain the most appropriate for completing the framework of embodied energy analyses of many types of products—even in non-standard cases. ‘Residential building construction’ was one of the 22 most complex Australian economic sectors (ie, comprising those requiring between 592 and 3215 direct energy paths to describe 90% of their total energy intensities). Consequently, for the other 87 non-energy sectors of the Australian economy, the input-output-based hybrid analysis method is likely to produce more reliable results than those calculated for the demonstration building using the direct energy paths for ‘residential building construction’. For more complex sectors than ‘residential building construction’, the new input-output-based hybrid analysis method derived here allows available process analysis data to be integrated with the input-output data in a comprehensive framework. The proportion of the result comprising the more reliable process analysis data can be calculated and used as a measure of the reliability of the result for that product or part of the product being analysed (for example, a building material or component). To ensure that future applications of the new input-output-based hybrid analysis method produce reliable results, new sources of process analysis data are required, including for such processes as services (for example, ‘banking’) and processes involving the transformation of basic materials into complex products (for example, steel and copper into an electric motor). 
However, even considering the limitations of the demonstration described above, the new input-output-based hybrid analysis method developed achieved the aim of the thesis: to develop a new embodied energy analysis method that allows reliable process analysis data to be integrated into the comprehensive, yet unreliable, input-output framework.
Plain language summary: Embodied energy analysis comprises the assessment of the direct and indirect energy requirements associated with a process. For example, the construction of a building requires the manufacture of steel structural members, and thus indirectly requires the energy used directly and indirectly in their manufacture. Embodied energy is an important measure of ecological sustainability because energy is used in virtually every human activity and many of these activities are interrelated. This thesis is concerned with the relationship between the completeness of embodied energy analysis methods and their reliability. Previous industry-based methods, while reliable, are incomplete. Previous national statistical methods, while comprehensive, are a ‘black box’ subject to errors. A new method is derived, involving the decomposition of the comprehensive national statistical model into components that can be modified discretely using the more reliable industry data, and is demonstrated for an individual building. The demonstration failed to integrate enough industry data into the national statistical model, due to the unexpected complexity of the national statistical data and the lack of available industry data regarding energy and non-energy product requirements. These unique findings highlight the flaws in previous methods. Reliable process analysis and input-output data are required, particularly for those processes that were unable to be examined in the demonstration of the new embodied energy analysis method. This includes the energy requirements of services sectors, such as banking, and processes involving the transformation of basic materials into complex products, such as refrigerators. The application of the new method to less complex products, such as individual building materials or components, is likely to be more successful than to the residential building demonstration.
APA, Harvard, Vancouver, ISO, and other styles
27

Dalcastagnè, Manuel. "Noise and Hotel Revenue Management in Simulation-based Optimization." Doctoral thesis, Università degli studi di Trento, 2021. http://hdl.handle.net/11572/319438.

Full text
Abstract:
Several exact and approximate dynamic programming formulations have already been proposed to solve hotel revenue management (RM) problems. To obtain tractable solutions, these methods are often bound by simplifying assumptions which prevent their application to large and dynamic complex systems. This dissertation introduces HotelSimu, a flexible simulation-based optimization approach for hotel RM, and investigates possible approaches to increase the efficiency of black-box optimization methods in the presence of noise. HotelSimu employs black-box optimization and stochastic simulation to find the dynamic pricing policy which is expected to maximize the revenue of a given hotel in a certain period of time. However, the simulation output is noisy, and different solutions should be compared in a statistically significant manner. Various black-box heuristics based on variations of random local search are investigated and integrated with statistical analysis techniques in order to manage the optimization budget efficiently.
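HotelSimu itself is not reproduced here, but the core idea the abstract describes (a random local search that accepts a new pricing solution only when replicated noisy simulations show a statistically significant improvement) can be sketched as follows. The revenue simulator, the Welch-style test, and all parameters below are illustrative stand-ins, not the thesis's actual components.

```python
# Sketch of random local search under noise: accept a neighbour only when a
# Welch-style test on replicated simulations finds a significant revenue gain.
# The "simulator" here is a synthetic stand-in, not HotelSimu.
import math
import random
from statistics import NormalDist

def simulate_revenue(price):
    # Noisy single-run "simulation": concave revenue curve plus demand noise.
    demand = max(0.0, 200.0 - 1.5 * price) * random.uniform(0.8, 1.2)
    return price * demand

def mean_var(xs):
    m = sum(xs) / len(xs)
    return m, sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

def significantly_better(a, b, alpha=0.05):
    """One-sided Welch test: is sample `a` drawn from a higher mean than `b`?"""
    (ma, va), (mb, vb) = mean_var(a), mean_var(b)
    se = math.sqrt(va / len(a) + vb / len(b))
    if se == 0.0:
        return ma > mb
    z = (ma - mb) / se            # normal approximation of the Welch statistic
    return NormalDist().cdf(z) > 1 - alpha

price, reps = 60.0, 30
best_sample = [simulate_revenue(price) for _ in range(reps)]
for step in range(200):
    candidate = price + random.uniform(-5.0, 5.0)         # local perturbation
    cand_sample = [simulate_revenue(candidate) for _ in range(reps)]
    if significantly_better(cand_sample, best_sample):    # noise-aware accept
        price, best_sample = candidate, cand_sample

print(f"best price found: {price:.2f}")
```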
APA, Harvard, Vancouver, ISO, and other styles
28

Amin, Gaurav Shirish. "Investing in hedge funds : analysing the 'black box'." Thesis, University of Reading, 2002. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.250708.

Full text
APA, Harvard, Vancouver, ISO, and other styles
29

Gibbs, Ronald L. "Uprating of underground cable circuits." Thesis, Queensland University of Technology, 1998. https://eprints.qut.edu.au/36057/1/36057_Gibbs_1998.pdf.

Full text
Abstract:
Electricity Authorities in Australia are under ever-increasing pressure to improve overall efficiency, minimise costs, and provide a better return on their investment. One of the areas that can be targeted is the better utilisation of existing plant and equipment. The objective of this study was to investigate the conventional method of cable modeling, and to research new approaches and develop novel techniques that could facilitate the wider application of uprating techniques to underground cable circuits. The benefits and shortcomings of the conventional method are discussed. The study shows that the one-dimensional model can give good agreement and could thus form the basis of an online model. It identified that the current cyclic model is incapable of modeling the changes in load that typically occur at weekends. It has also shown that fixed standard values of thermal resistivity and diffusivity do not allow the model to respond to the dynamic changes in these values on a day-to-day basis. It is important to have continually updated values of ambient temperature and soil thermal parameters. A novel transducer, the Transient Thermal Sphere, was further developed and investigated, and the results of an in situ field investigation were analysed. The values of thermal resistivity and diffusivity obtained from the thermal spheres in the field trials were in the expected range for soils of their type; however, they were lower than the traditionally assumed values and varied on a daily basis due to other environmental conditions. A third study applied an entirely novel approach, implementing system identification techniques. It was clearly determined that a 10th-order ARX model could predict the temperature of the cable sheath 2.5 hours ahead with a mean square error of less than 0.0625°C.
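As a rough illustration of the system identification approach (not the thesis's actual model or data), the sketch below fits a 10th-order ARX model by least squares on synthetic signals and iterates it forward five half-hour steps, i.e. 2.5 hours ahead. The sampling interval, signals, and horizon are assumptions.

```python
# Sketch: fit an ARX model y(t) = sum a_k y(t-k) + sum b_k u(t-k) by least
# squares and iterate it forward. Data and sampling are hypothetical stand-ins
# (e.g. y = sheath temperature, u = load, 30-min samples, so 2.5 h = 5 steps).
import numpy as np

rng = np.random.default_rng(0)
n = 500
u = np.sin(np.arange(n) * 0.05) + 0.1 * rng.standard_normal(n)   # load proxy
y = np.zeros(n)
for t in range(2, n):                          # synthetic second-order "truth"
    y[t] = 1.2*y[t-1] - 0.4*y[t-2] + 0.5*u[t-1] + 0.02*rng.standard_normal()

na = nb = 10                                   # ARX(10, 10), as in the thesis
rows, targets = [], []
for t in range(max(na, nb), n):
    rows.append(np.concatenate([y[t-na:t][::-1], u[t-nb:t][::-1]]))
    targets.append(y[t])
theta, *_ = np.linalg.lstsq(np.array(rows), np.array(targets), rcond=None)
a, b = theta[:na], theta[na:]

def predict_ahead(t0, steps=5):
    """Iterate the one-step predictor: 5 half-hour steps = 2.5 hours ahead."""
    hist = list(y[:t0])                        # measured history up to t0
    for k in range(steps):
        t = t0 + k
        y_lags = np.array(hist[-1:-na-1:-1])   # y(t-1) ... y(t-na)
        u_lags = u[t-nb:t][::-1]               # u(t-1) ... u(t-nb), assumed known
        hist.append(a @ y_lags + b @ u_lags)
    return hist[-1]

print(f"predicted y[404]: {predict_ahead(400):.3f}   actual: {y[404]:.3f}")
```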
APA, Harvard, Vancouver, ISO, and other styles
30

Rapoport, Robert S. "The iterative frame : algorithmic video editing, participant observation & the black box." Thesis, University of Oxford, 2016. https://ora.ox.ac.uk/objects/uuid:8339bcb5-79f2-44d1-b78d-7bd28aa1956e.

Full text
Abstract:
Machine learning is increasingly involved in both our production and consumption of video. One symptom of this is the appearance of automated video editing applications. As this technology spreads rapidly to consumers, the need for substantive research about its social impact grows. To this end, this project maintains a focus on video editing as a microcosm of larger shifts in cultural objects co-authored by artificial intelligence. The window in which this research occurred (2010-2015) saw machine learning move increasingly into the public eye, and with it ethical concerns. What follows is, on the most abstract level, a discussion of why these ethical concerns are particularly urgent in the realm of the moving image. Algorithmic editing consists of software instructions to automate the creation of timelines of moving images; the criteria that this software uses to query a database are variable. Algorithmic authorship already exists in other media, but I will argue that the moving image is a separate case insofar as software can generate the raw material of text and music on its own, whereas the performance of a trained actor still cannot be generated by software. Thus, my focus is on the relationship between live embodied performance and the subsequent algorithmic editing of that footage. This is a process that can employ other software, like computer vision (to analyze the content of video) and predictive analytics (to guess what kind of automated film to make for a given user). How is performance altered when it has to communicate to human and non-human alike? The ritual of the iterative frame gives literal form to something that throughout human history has been a projection: the omniscient participant observer, more commonly known as the Divine. We experience black-boxed software (AIs, specifically neural networks, which are intrinsically opaque) as functionally omniscient and tacitly allow it to edit more and more of life (e.g. filtering articles, playlists and even potential spouses). As long as it remains disembodied, we will continue to project the Divine onto the black box, causing cultural anxiety. In other words, predictive analytics alienate us from the source code of our cultural texts. The iterative frame, then, is a space in which these forces can be inscribed on the body, and hence narrated. The algorithmic editing of content is already taken for granted. The editing of moving images, in contrast, still requires a human hand. We need to understand the social power of moving image editing before it is delegated to automation. Practice Section: This project is practice-led, meaning that the portfolio of work was produced as it was being theorized. To underscore this, the portfolio comes at the end of the document. Video editors use artificial intelligence (AI) in a number of different applications, from deciding the sequencing of timelines to using facial and language detection to find actors in archives. This changes traditional production workflows on a number of levels. How can the single decision to cut between two frames of video speak to the larger epistemological shifts brought on by predictive analytics and Big Data (upon which they rely)? When predictive analytics begin modeling the world of moving images, how will our own understanding of the world change? In the practice-based section of this thesis, I explore how these shifts will change the way in which actors might approach performance. What does a gesture mean to AI, and how will the editor decontextualize it?
The set of a video shoot that will employ an element of AI in editing represents a move towards the ritualization of production, summarized in the term the 'iterative frame'. The portfolio contains eight works that treat the set as a microcosm of larger shifts in the production of culture. There is, I argue, metaphorical significance in the changing understanding of terms like 'continuity' and 'sync' on the AI-watched set. Theory Section: In the theoretical section, the approach is broadly comparative. I contextualize the current dynamic by looking at previous shifts in technology that changed the relationship between production and post-production, notably the lightweight recording technology of the 1960s. This section also draws on debates in ethnographic filmmaking about the matching of film and ritual. In this body of literature, there is a focus on how participant observation can be formalized in film. Triangulating between event, participant observer and edit grammar in ethnographic filmmaking provides a useful analogy for understanding how AI as film editor might function in relation to contemporary production. Rituals occur in a frame that is dependent on a spatially/temporally separate observer. This dynamic also exists on sets bound for post-production involving AI. The convergence of film grammar and ritual grammar occurred in the 1960s under the banner of cinéma vérité, in which the relationship between participant observer/ethnographer and subject became most transparent. In Rouch and Morin's Chronicle of a Summer (1961), reflexivity became ritualized in the form of on-screen feedback sessions. The edit became transparent: the black box of cinema disappeared. Today, as artificial intelligence enters the film production process, this relationship begins to reverse: feedback, while it exists, becomes less transparent. The weight of the feedback ritual is gradually shifted from presence and production to montage and post-production. Put differently, in cinéma vérité the participant observer was most present in the frame. As participant observation gradually becomes shared with code, it becomes more difficult to give it an embodied representation, and thus its presence is felt more in the edit of the film. The relationship between the ritual actor and the participant observer (the algorithm) is completely mediated by the edit, a reassertion of the black box where once it had been transparent. The crucible for looking at the relationship between algorithmic editing, participant observation and the black box is the subject in trance. In ritual trance the individual is subsumed by collective codes. Long before the advent of automated editing, trance posed an epistemological problem for film editing. In the iterative frame, for the first time, film grammar can echo ritual grammar and indeed become continuous with it. This occurs through removing the act of cutting from the causal world, and projecting this logic of post-production onto performance. Why does this occur? Ritual, and specifically ritual trance, is the moment when a culture gives embodied form to what it could not otherwise articulate. The trance of predictive analytics, the AI that increasingly choreographs our relationship to information, is the ineffable that finds form in the iterative frame. In the iterative frame a gesture never exists in a single instance, but in a potential state.
The performers in this frame begin to understand themselves in terms of how automated indexing processes reconfigure their performance. To the extent that gestures are complicit with this mode of databasing, they can be seen as votive toward the algorithmic. The practice section focuses on the poetics of this position. Chapter One focuses on cinéma vérité as a moment in which the relationship between production and post-production shifted as a function of more agile recording technology, allowing the participant observer to enter the frame. This shift becomes a lens through which to look at the changes that AI might bring. Chapter Two treats the work of Pierre Huyghe as a 'liminal phase' in which a new relationship between production and post-production is explored. Finally, Chapter Three looks at a film in which actors perform with the awareness that the footage will be processed by an algorithmic edit. The conclusion considers how this way of relating to AI, especially commercial AI, through embodied performance could foster a more critical relationship to the proliferating black-boxed modes of production.
APA, Harvard, Vancouver, ISO, and other styles
31

Blumen, Sacha Carl. "Granularity and state socialisation: explaining Germany’s 2015 refugee policy reversal." Thesis, Canberra, ACT : The Australian National University, 2016. http://hdl.handle.net/1885/111430.

Full text
Abstract:
Between late August and mid-November 2015, the German Government liberalised its refugee policy to allow an unlimited number of people to claim asylum in the country, and then made a near-reversal of this policy by calling for Europe-wide quotas on the number of refugees entering the EU and a reduction in the number of refugees Germany would admit. The German Government's decisions to liberalise and then backtrack on its refugee policy within a short time period, at a time when many people were still seeking asylum from the Syrian civil war, present a puzzle to the dominant International Relations theories of state socialisation (constructivism and rational choice), which do not explain this type of observed real-world behaviour well. By using the Foreign Policy Analysis literature to augment the constructivist and rational choice approaches, I argue that a more granular approach can help explain Germany's backtracking on refugee policy in 2015. I focus on the domestic actors, institutions, and the contested processes of their interactions from which state policy emerged. Using this approach, I explain Germany's backtracking on its refugee policy as the result of varying sets of interactions over time among actors who had different and potentially changing interests and beliefs. This focus on granularity and contestation within state policy-making processes provides a more precise understanding of the dynamics of policy making, from which we gain greater insight into this puzzling example of state behaviour. Such approaches may also help explain other examples of state behaviour that are similarly mysterious.
APA, Harvard, Vancouver, ISO, and other styles
32

"Next Generation Black-Box Web Application Vulnerability Analysis Framework." Master's thesis, 2017. http://hdl.handle.net/2286/R.I.44256.

Full text
Abstract:
Web applications are an incredibly important aspect of our modern lives. Organizations and developers use automated vulnerability analysis tools, also known as scanners, to automatically find vulnerabilities in their web applications during development. Scanners have traditionally fallen into two types of approaches: black-box and white-box. In the black-box approaches, the scanner does not have access to the source code of the web application, whereas a white-box approach has access to the source code. Today's state-of-the-art black-box vulnerability scanners employ various methods to fuzz and detect vulnerabilities in a web application. However, these scanners attempt to fuzz the web application with a number of known payloads, trying to trigger a vulnerability. This technique is simple but does not understand the web application that it is testing. This thesis presents a new approach to vulnerability analysis. The vulnerability analysis module presented uses a novel approach of Inductive Reverse Engineering (IRE) to understand and model the web application. IRE first attempts to understand the behavior of the web application by giving a certain number of input/output pairs to the web application. Then, the IRE module hypothesizes a set of programs (in a limited language specific to web applications, called AWL) that satisfy the input/output pairs. These hypotheses take the form of a directed acyclic graph (DAG). The AWL vulnerability analysis module can then attempt to detect vulnerabilities in this DAG. Further, it generates a payload based on the DAG, which is therefore a precise payload to trigger the potential vulnerability (based on the inferred model of the program). It then tests this potential vulnerability using the generated payload on the actual web application, and creates a verification procedure to check whether the potential vulnerability is actually exploitable, based on the web application's response.
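The AWL language and the full IRE algorithm are specific to the thesis and are not reproduced here, but the underlying idea (infer a model of the application's transformation from input/output pairs, then derive a payload precise to that model rather than firing generic payloads) can be conveyed with a toy sketch. The hypothesis space and payloads below are invented for illustration.

```python
# Toy sketch of the idea behind Inductive Reverse Engineering (IRE): infer
# the transformation a web application applies to input from observed
# input/output pairs, then craft a payload for the inferred program.
# The hypothesis space and payloads are illustrative, not the thesis's AWL.
import html
import re

HYPOTHESES = {
    "identity":    lambda s: s,
    "html_escape": lambda s: html.escape(s),
    "strip_tags":  lambda s: re.sub(r"<[^>]*>", "", s),
    "lowercase":   lambda s: s.lower(),
}

def infer_program(io_pairs):
    """Return the names of hypothesis programs consistent with all observations."""
    return [name for name, f in HYPOTHESES.items()
            if all(f(i) == o for i, o in io_pairs)]

def payload_for(program):
    # A probe tailored to the inferred transformation (illustrative only).
    if program == "identity":
        return "<script>alert(1)</script>"   # reflected unmodified: XSS probe
    if program == "strip_tags":
        # No closing '>' means the tag regex above never matches, yet browsers
        # commonly parse the dangling tag: a payload precise to the model.
        return "<img src=x onerror=alert(1)"
    return None  # e.g. html_escape defeats these simple probes

# Observed behaviour of the (hypothetical) application under test:
observations = [("<b>hi</b>", "hi"), ("a<i>b", "ab")]
for prog in infer_program(observations):
    print(prog, "->", payload_for(prog))
```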
APA, Harvard, Vancouver, ISO, and other styles
33

Sun, LinBo, and 孫林波. "Black Box Deconstructs and Performance Analysis On Academic Commercialization." Thesis, 2019. http://ndltd.ncl.edu.tw/handle/6z65tk.

Full text
APA, Harvard, Vancouver, ISO, and other styles
34

Nevile, Maurice. "Beyond the black box : talk-in-interaction in the airline cockpit." PhD thesis, 2001. http://hdl.handle.net/1885/147619.

Full text
APA, Harvard, Vancouver, ISO, and other styles
35

Lin, Wei-Yu. "The Design and Analysis of Black-Box Test Plan for a Message-Based Communication System." 2007. http://www.cetd.com.tw/ec/thesisdetail.aspx?etdun=U0001-2707200710441600.

Full text
APA, Harvard, Vancouver, ISO, and other styles
36

Lin, Wei-Yu, and 林偉裕. "The Design and Analysis of Black-Box Test Plan for a Message-Based Communication System." Thesis, 2007. http://ndltd.ncl.edu.tw/handle/91211739792705771582.

Full text
Abstract:
Master's thesis, National Taiwan University, Graduate Institute of Electrical Engineering (ROC year 95). The cornerstone of an effective test program is test planning. In software development, a test plan is a document that describes the objectives, requirements, test approaches, and the design and execution of test cases. In our team, we have developed a graphical testcase editor that allows users to draw high-level test cases as MSCs (Message Sequence Charts), and a test compiler that translates MSCs into test executables in C/C++. We have also developed a configurable mobile phone simulator with versatility for the general capabilities we may expect from a mobile phone, such as dialing, call answering, MP3 playing, and calculator operation. The main purpose of this paper is to propose a test plan for the system under test (SUT) and the test compiler. The test plan covers:
• functionality evaluation;
• performance measurement; and
• reusable testcase design and execution.
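The testcase editor, test compiler, and phone simulator from the thesis are not available here, but the black-box essence of an MSC-driven test (check that the chart's messages occur, in order, in the trace recorded from the system under test) can be sketched as follows; the message names and trace format are hypothetical.

```python
# Sketch: a high-level test case as a Message Sequence Chart (MSC), checked
# against a message trace recorded from the system under test (SUT).
# Message names and the trace format are hypothetical.

# Each MSC event: (sender, receiver, message)
msc_dial_call = [
    ("tester", "phone", "DIAL 555-0100"),
    ("phone", "network", "SETUP"),
    ("network", "phone", "RINGING"),
    ("network", "phone", "CONNECT"),
]

def conforms(msc, trace):
    """Black-box check: do the MSC events occur in the trace, in order?
    Extra interleaved messages from the SUT are tolerated."""
    it = iter(trace)
    return all(event in it for event in msc)   # ordered-subsequence idiom

# A trace the phone simulator might have produced, with an extra noise event.
recorded = [
    ("tester", "phone", "DIAL 555-0100"),
    ("phone", "ui", "SHOW dialing"),
    ("phone", "network", "SETUP"),
    ("network", "phone", "RINGING"),
    ("network", "phone", "CONNECT"),
]
print("PASS" if conforms(msc_dial_call, recorded) else "FAIL")
```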
APA, Harvard, Vancouver, ISO, and other styles
37

Lin, Ta-Yi, and 林大益. "An Analysis of The Black Box of The Relationship Between Corporate Social Responsibly and Corporate Financial Performance." Thesis, 2013. http://ndltd.ncl.edu.tw/handle/65y4e7.

Full text
Abstract:
Master's thesis, National Chiao Tung University, Master Program of Business Administration (ROC year 101). The purpose of this study is to examine whether any mediator exists between corporate social responsibility (CSR) and corporate financial performance (CFP). Specifically, this study focuses on a possible mediating role of innovation capital, one of the mediators recommended in the literature between CSR and CFP. In addition, a moderator, employee ownership, is examined to determine whether it moderates among CSR, innovation capital, and CFP. This study employs data from KLD for CSR, Compustat for financial statement variables (e.g. innovation capital or financial performance), and ERISA Form 5500 for employee ownership. The resulting data include 244 companies, with CSR from year 2006, innovation capital from year 2007, employee ownership from year 2006, and CFP from year 2010. For analysis, this study uses both AMOS and SAS. The results indicate that CSR has a significant positive influence on CFP, as does innovation capital. However, the influence of CSR on innovation capital is not significant. On the other hand, employee ownership moderates between innovation capital and CFP.
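The study's structural equation models in AMOS and SAS are not reproduced here; as a hedged illustration, a regression-based mediation check in the spirit of Baron and Kenny might look like the following sketch, on synthetic data wired to mimic the reported pattern (CSR and innovation capital each predict CFP, while CSR does not predict innovation capital).

```python
# Sketch of a regression-based mediation check (Baron & Kenny style) for
# CSR -> innovation capital -> CFP, on synthetic data. The thesis itself
# used structural equation modelling in AMOS and SAS.
import numpy as np

rng = np.random.default_rng(1)
n = 244                                       # sample size from the study
csr = rng.normal(size=n)
inno = rng.normal(size=n)                     # no real CSR -> innovation link
cfp = 0.3 * csr + 0.5 * inno + rng.normal(size=n)

def ols(y, *xs):
    """Return coefficients of y on an intercept plus the given regressors."""
    X = np.column_stack([np.ones(len(y)), *xs])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return beta[1:]                           # drop the intercept

print("CFP ~ CSR              :", ols(cfp, csr))        # total effect
print("innovation ~ CSR       :", ols(inno, csr))       # path a (weak here)
print("CFP ~ CSR + innovation :", ols(cfp, csr, inno))  # direct effect + path b
```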
APA, Harvard, Vancouver, ISO, and other styles
38

Neves, Francisco Nuno Teixeira. "Holistic performance and scalability analysis for large-scale distributed systems." Doctoral thesis, 2021. http://hdl.handle.net/1822/75536.

Full text
Abstract:
Doctoral Programme in Informatics (MAP-i). Internet services play a critical role in our personal and professional lives every day. Monitoring these services is of paramount importance for the quick detection and resolution of anomalies, in order to reduce downtime and the persistence of erroneous behavior that would otherwise cause serious negative impact. The availability of detailed data about the behavior of the system as a whole is crucial for analysing and troubleshooting anomalies. Although most software components individually provide outputs that developers foresee as useful for diagnosing performance and behavioral issues, these outputs are not correlated in a causally-consistent fashion. Collecting cross-component outputs, in turn, requires substantial effort to instrument the source code of each software component so that causally-correlated outputs can be generated, which is not always viable given the heterogeneity of the architecture's components and inaccessible portions of code. This creates a trade-off between black-box outputs and the detail they provide. In this dissertation, we explore this trade-off in order to enable effective analysis and troubleshooting of complex distributed systems without requiring application-specific knowledge. This was achieved by operating at the common layer of all software components, through which all interactions are performed. Specifically, we focused on the operating system's kernel as the layer of observation from which to extract information about the behavior of the system. Our approach relies on events triggered by distributed processes within applicational components, and on their well-known causal relations, to aid in diagnosing performance and behavior anomalies. Using three case studies, we validated that our approach represents a step forward towards observable complex distributed architectures without application-specific knowledge.
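The kernel-level tracer itself is not shown here, but the core idea (stitch per-process events into cross-component causal flows using the well-known relation between a connection's send and receive events) can be illustrated with a small sketch; the event records below are hypothetical stand-ins for tracer output.

```python
# Sketch: stitch per-process kernel events into cross-component causal edges
# using the send/receive relation on connections. The event records are
# hypothetical stand-ins for what a kernel-level tracer would emit.
events = [
    {"ts": 1, "pid": "web", "type": "SND", "conn": ("web", "app", 1)},
    {"ts": 2, "pid": "app", "type": "RCV", "conn": ("web", "app", 1)},
    {"ts": 3, "pid": "app", "type": "SND", "conn": ("app", "db", 7)},
    {"ts": 9, "pid": "db",  "type": "RCV", "conn": ("app", "db", 7)},
]

def causal_edges(events):
    """Pair each send with its receive: happens-before edges across processes."""
    pending, edges = {}, []
    for e in sorted(events, key=lambda e: e["ts"]):
        if e["type"] == "SND":
            pending[e["conn"]] = e
        elif e["type"] == "RCV" and e["conn"] in pending:
            snd = pending.pop(e["conn"])
            edges.append((snd["pid"], e["pid"], e["ts"] - snd["ts"]))
    return edges

for src, dst, delay in causal_edges(events):
    print(f"{src} -> {dst} (delay {delay})")   # e.g. flag the slow app -> db hop
```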
APA, Harvard, Vancouver, ISO, and other styles
39

Filipe, Ricardo Ângelo Santos. "Client-Side Monitoring of Distributed Systems." Doctoral thesis, 2020. http://hdl.handle.net/10316/91181.

Full text
Abstract:
Thesis within the Doctoral Program in Information Sciences and Technologies, presented to the Faculty of Sciences and Technology of the University of Coimbra. From critical systems to entertainment, most computer systems have become distributed. Compared to standalone applications, distributed systems are more complex and difficult to operate and maintain, thus increasing the probability of outages or other malfunctions. Properly monitoring the system is therefore even more important. However, recovering a complete image of the system is a herculean task for administrators, who often need to resort to a large plethora of tools. Despite all these tools, the person that often identifies the degradation or the system outage is the one that is somehow disregarded in the monitoring chain: the client. Almost daily, we have examples in the news of companies that had outages or system degradation perceived by the final client, with a direct impact on the companies' revenues and image. The lack of client-side monitoring and the opportunity to improve current monitoring mechanisms paved the way for the key research question in this thesis. We argue that the client has information on the distributed system that monitoring applications should use to improve performance and resilience. In this work, we aim to evaluate the limits of black-box client-side monitoring and to extend white-box monitoring with client information. Additionally, we are very interested in understanding what kind of information the system leaks to the client. To evaluate this approach, we resorted to several experiments in distinct scenarios, from three-tier web sites to microservice architectures, where we tried to identify performance issues from the client-side point of view. We used client profiling and machine learning techniques, among other methods, to demonstrate that using client information may serve to improve the observability of a distributed system. Properly including client-side information proved to be an interesting and challenging research effort. We believe that our work contributed to advancing the current state-of-the-art in distributed system monitoring. The client has viable information that eludes administrators and provides important insights on the system.
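The thesis's client profiling and machine learning pipelines are not reproduced here; a minimal sketch of the client-side idea (flag server-side degradation when client-observed response times drift well above an established baseline) might look like this, with window sizes, threshold, and timings invented for illustration.

```python
# Sketch: black-box client-side monitoring. The client keeps a baseline of
# observed response times and raises an alarm when recent requests drift
# well above it. Window sizes, threshold, and timings are illustrative.
from collections import deque
from statistics import median

class ClientSideMonitor:
    def __init__(self, baseline_window=100, recent_window=10, factor=3.0):
        self.baseline = deque(maxlen=baseline_window)
        self.recent = deque(maxlen=recent_window)
        self.factor = factor

    def observe(self, response_time_ms):
        self.recent.append(response_time_ms)
        alarm = (len(self.baseline) == self.baseline.maxlen
                 and median(self.recent) > self.factor * median(self.baseline))
        if not alarm:                  # keep the baseline free of bad periods
            self.baseline.append(response_time_ms)
        return alarm

monitor = ClientSideMonitor()
timings = [20, 22, 19, 25] * 30 + [95, 110, 102, 98, 120] * 3   # degradation
for t in timings:
    if monitor.observe(t):
        print("client-side alarm: server-side degradation suspected")
        break
```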
APA, Harvard, Vancouver, ISO, and other styles
We offer discounts on all premium plans for authors whose works are included in thematic literature selections. Contact us to get a unique promo code!