
Dissertations / Theses on the topic 'Explicit constraints'



Consult the top 18 dissertations / theses for your research on the topic 'Explicit constraints.'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and the bibliographic reference to the chosen work will be generated automatically in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever these are available in the metadata.

Browse dissertations / theses in a wide variety of disciplines and organise your bibliography correctly.

1

Surowiec, Thomas Michael. "Explicit stationarity conditions and solution characterization for equilibrium problems with equilibrium constraints." Doctoral thesis, Humboldt-Universität zu Berlin, Mathematisch-Naturwissenschaftliche Fakultät II, 2010. http://dx.doi.org/10.18452/16087.

Full text
Abstract:
This thesis is concerned with equilibrium problems with equilibrium constraints, or EPECs. Concretely, we consider models formed by coupling two-level optimization problems whose upper-level solutions are non-cooperative (Nash-Cournot) equilibria. One of the main goals of the thesis is the formulation of dual stationarity conditions for EPECs. A model of oligopolistic competition in electricity markets is considered as an application. In order to derive qualitative hypotheses concerning the structure of the considered models, e.g., inactivity of certain market participants at equilibrium, as well as to provide conditions useful for numerical procedures, the ability to express EPEC solutions in terms of the input data of the problem is of considerable importance. The way there requires a structural analysis of the involved optimization problems (constraint qualifications, regularity), the derivation of stability results for certain multivalued mappings, and the use of transformation formulae for so-called coderivatives. Further important topics address the relationship between various dual stationarity types, e.g., S- and M-stationarity, as well as the extension of the considered problem classes to a stochastic setting, i.e., stochastic EPECs or SEPECs.
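Schematically (and only schematically; this is not the thesis's precise formulation, and the symbols f_i, X_i and S are generic placeholders), an EPEC couples one MPEC per upper-level player i:

\min_{x_i \in X_i,\; y} \; f_i(x_i, x_{-i}, y) \quad \text{s.t.} \quad y \in S(x_i, x_{-i}),

where S(x) denotes the solution set of the lower-level equilibrium problem. A pair (x^*, y^*) is an EPEC solution if, for every player i, (x_i^*, y^*) solves player i's problem with the rivals' decisions fixed at x_{-i}^*; dual stationarity conditions of the kind formulated in the thesis are first-order conditions for such Nash points.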
APA, Harvard, Vancouver, ISO, and other styles
2

Vonwirth, Christian [Verfasser], and Jörn [Akademischer Betreuer] Sass. "Continuous-Time Portfolio Optimization under Partial Information and Convex Constraints: Deriving Explicit Results / Christian Vonwirth ; Betreuer: Jörn Sass." Kaiserslautern : Technische Universität Kaiserslautern, 2017. http://d-nb.info/1137206500/34.

Full text
APA, Harvard, Vancouver, ISO, and other styles
3

Nguyen, Hoai Nam. "Constrained control for uncertain systems : an interpolation based control approach." Thesis, Supélec, 2012. http://www.theses.fr/2012SUPL0014/document.

Full text
Abstract:
A fundamental problem in automatic control is the control of uncertain plants in the presence of input and state or output constraints. An elegant and theoretically most satisfying framework is represented by optimal control policies, which, however, rarely give an analytical feedback solution and often build on numerical solutions (approximations). Therefore, in practice, the problem has seen many ad-hoc solutions, such as override control and anti-windup, as well as modern techniques developed during the last decades, usually based on state-space models. One popular example is Model Predictive Control (MPC), where an optimal control problem is solved at each sampling instant and the element of the control vector meant for the nearest sampling interval is applied. In spite of the increased computational power of control computers, MPC is at present mainly suitable for low-order, nominally linear systems. The robust version of MPC is conservative and computationally complicated, while the explicit version of MPC, which gives an affine state-feedback solution, involves a very complicated division of the state space into polyhedral cells. In this thesis a novel and computationally cheap solution is presented for linear, time-varying or uncertain, discrete-time systems with polytopic bounds on the control and state (or output) vectors and with bounded disturbances. The approach is based on interpolation between a stabilizing outer controller that respects the control and state constraints and an inner, more aggressive controller, designed by any method, that has a robustly positively invariant set within the constraints. A simple Lyapunov function is used for the proof of closed-loop stability. In contrast to MPC, the new interpolation-based controller does not necessarily employ an optimization criterion inspired by performance. In its explicit form, the cell partitioning is simpler than its MPC counterpart. For the implicit version, the on-line computational demand can be restricted to the solution of one linear program or quadratic program. Several simulation examples are given, including uncertain linear systems with output feedback and disturbances. Some examples are compared with MPC. The control of a laboratory ball-and-plate system is also demonstrated. It is believed that the new controller might see widespread use in industry, including the automotive industry, also for the control of fast, high-order systems with constraints.
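For readers unfamiliar with the interpolation idea, the following is a minimal numerical sketch, not the thesis's actual algorithm: it assumes an outer feasible polytope {x : Hn x <= gn} with a constraint-admissible feedback Kn, an inner robustly invariant polytope {x : Ho x <= go} with an aggressive feedback Ko, and computes the interpolation coefficient by one small linear program (the names Hn, gn, Kn, Ho, go, Ko are illustrative).

import numpy as np
from scipy.optimize import linprog

def interpolating_control(x, Hn, gn, Kn, Ho, go, Ko):
    """Interpolation between an outer (safe) and inner (aggressive) controller.

    Decomposes x = rv + ro with rv = c*xv, xv in the outer set, and
    ro = (1-c)*xo, xo in the inner invariant set, minimising c in [0, 1].
    Returns the blended control u = Kn*rv + Ko*(x - rv) and the coefficient c.
    """
    n = x.size
    # Decision vector z = [rv (n entries), c]
    cost = np.zeros(n + 1)
    cost[-1] = 1.0                        # minimise the interpolation coefficient c
    # Hn*rv <= c*gn            ->  Hn*rv - gn*c <= 0
    A1 = np.hstack([Hn, -gn.reshape(-1, 1)])
    b1 = np.zeros(Hn.shape[0])
    # Ho*(x - rv) <= (1-c)*go  ->  -Ho*rv + go*c <= go - Ho*x
    A2 = np.hstack([-Ho, go.reshape(-1, 1)])
    b2 = go - Ho @ x
    res = linprog(cost, A_ub=np.vstack([A1, A2]), b_ub=np.hstack([b1, b2]),
                  bounds=[(None, None)] * n + [(0.0, 1.0)], method="highs")
    if not res.success:
        raise RuntimeError("state outside the outer feasible set")
    rv, c = res.x[:n], res.x[-1]
    u = Kn @ rv + Ko @ (x - rv)           # convex combination of the two control laws
    return u, c

Minimising c pushes the state decomposition toward the aggressive inner controller whenever the constraints allow it, which is the mechanism the abstract describes.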
APA, Harvard, Vancouver, ISO, and other styles
4

Appleget, Jeffrey A. "Knapsack cuts and explicit-constraint branching for solving integer programs." Thesis, Monterey, California. Naval Postgraduate School, 1997. http://hdl.handle.net/10945/8601.

Full text
Abstract:
Approved for public release; distribution is unlimited
Enhanced solution techniques are developed for solving integer programs (IPs) and mixed-integer programs (MIPs). Previously unsolvable problems can be solved with these new techniques. We develop knapsack cut-finding procedures for minimal cover cuts, and convert existing cut-strengthening theory into practical procedures that lift and tighten violated minimal cover valid inequalities to violated knapsack facets in polynomial time. We find a new class of knapsack cuts called 'non-minimal cover cuts' and a method of lifting them called 'deficit lifting.' Deficit lifting enables all of these cuts to be lifted and tightened to facets as well. Extensions of these techniques enable us to find cuts for elastic knapsack constraints and cuts for non-standard knapsack constraints. We also develop the new technique of 'explicit-constraint branching' (ECB). ECB enables the technique of constraint branching to be used on IPs and MIPs that do not have the structure required for known 'implicit constraint branching' techniques. When these techniques were applied to 84 randomly generated generalized assignment problems, the combination of knapsack cuts and explicit-constraint branching was able to solve 100% of the problems in under 1000 CPU seconds. Explicit-constraint branching alone solved 94%, and knapsack cuts alone solved 93%. Standard branch and bound alone solved only 38%. The benefits of these techniques are also demonstrated on some real-world generalized assignment and set-partitioning problems.
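As a hedged illustration of the kind of object this abstract discusses, the snippet below implements only the textbook greedy separation heuristic for a violated (minimal) cover cut of a 0-1 knapsack row a·x <= b with positive coefficients at a fractional LP point x*; the lifting, tightening, and deficit-lifting procedures developed in the thesis are not reproduced, and the function name is illustrative.

def find_cover_cut(a, b, x_star, tol=1e-9):
    """Greedy separation of a cover inequality for sum(a[j]*x[j]) <= b, x binary, a[j] > 0.

    Returns (cover, violated): cover is a list of indices C with sum(a[j] for j in C) > b;
    the cover cut is sum(x[j] for j in C) <= len(C) - 1, and violated tells whether
    x_star violates it.
    """
    # Prefer items whose LP value is close to 1 (small (1 - x*_j) per unit of weight).
    order = sorted(range(len(a)), key=lambda j: (1.0 - x_star[j]) / a[j])
    cover, weight = [], 0.0
    for j in order:
        cover.append(j)
        weight += a[j]
        if weight > b + tol:
            break
    else:
        return [], False                      # no cover exists; the row cannot be covered
    # Make the cover minimal: drop items not needed to exceed the capacity b.
    for j in sorted(cover, key=lambda j: a[j], reverse=True):
        if weight - a[j] > b + tol:
            cover.remove(j)
            weight -= a[j]
    violated = sum(x_star[j] for j in cover) > len(cover) - 1 + tol
    return cover, violated

# Example: 8*x0 + 6*x1 + 5*x2 + 4*x3 <= 12 at x* = (0.9, 0.8, 0.9, 0.1)
print(find_cover_cut([8, 6, 5, 4], 12, [0.9, 0.8, 0.9, 0.1]))   # -> ([0, 2], True)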
APA, Harvard, Vancouver, ISO, and other styles
5

Skrede, Ole-Johan. "Explicit, A Priori Constrained Model Parameterization for Inverse Problems, Applied on Geophysical CSEM Data." Thesis, Norges teknisk-naturvitenskapelige universitet, Institutt for matematiske fag, 2014. http://urn.kb.se/resolve?urn=urn:nbn:no:ntnu:diva-27343.

Full text
Abstract:
This thesis introduces a new parameterization of the model space in global inversion problems. The parameterization provides an explicit representation of the model space with a basis constrained by a priori information about the problem at hand. It is able to represent complex model structures with few parameters, thereby speeding up the inversion, as the number of iterations needed to converge scales strongly with the number of parameters in stochastic, global inversion methods. A standard Simulated Annealing optimization routine is implemented and further extended to optimize over a dynamically varying number of variables. The method is applied to the inversion of marine CSEM data; it inverts both synthetic and real data sets and recovers resistivity profiles that show good agreement with the provided well-bore log data. The trans-dimensional, self-parameterizing Simulated Annealing algorithm introduced in this thesis proves to be superior to the regular algorithm with fixed parameter dimensions.
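The following is a minimal, hedged sketch of what a trans-dimensional simulated annealing loop can look like in general; it is not Skrede's parameterization, `misfit` stands for whatever data-misfit functional the inversion uses, and the move probabilities, cooling schedule and step sizes are illustrative.

import math
import random

def trans_dimensional_sa(misfit, init_model, n_iter=10000, t0=1.0, cooling=0.999,
                         step=0.1, min_dim=1, max_dim=20, rng=random.Random(0)):
    """Simulated annealing over models whose number of parameters may vary.

    Each iteration proposes either a birth move (add a parameter), a death move
    (remove one), or a perturbation of one parameter; the proposal is accepted
    with the Metropolis criterion and the temperature is cooled geometrically.
    """
    model = list(init_model)
    best, best_cost = list(model), misfit(model)
    cost, temp = best_cost, t0
    for _ in range(n_iter):
        candidate = list(model)
        move = rng.random()
        if move < 0.1 and len(candidate) < max_dim:        # birth move
            candidate.insert(rng.randrange(len(candidate) + 1), rng.gauss(0.0, 1.0))
        elif move < 0.2 and len(candidate) > min_dim:      # death move
            candidate.pop(rng.randrange(len(candidate)))
        else:                                              # perturb one parameter
            k = rng.randrange(len(candidate))
            candidate[k] += rng.gauss(0.0, step)
        new_cost = misfit(candidate)
        if new_cost < cost or rng.random() < math.exp(-(new_cost - cost) / temp):
            model, cost = candidate, new_cost
            if cost < best_cost:
                best, best_cost = list(model), cost
        temp *= cooling
    return best, best_cost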
APA, Harvard, Vancouver, ISO, and other styles
6

Nguyen, Ngoc Anh. "Explicit robust constrained control for linear systems : analysis, implementation and design based on optimization." Thesis, Université Paris-Saclay (ComUE), 2015. http://www.theses.fr/2015SACLC012/document.

Full text
Abstract:
Piecewise affine (PWA) feedback control laws have received significant attention due to their relevance for the control of constrained systems and hybrid systems, and equally for the approximation of nonlinear control. However, they are associated with serious implementation issues. Motivated by the interest in this particular class of controllers, this thesis is mostly related to their analysis and design. The first part of this thesis aims to compute the robustness and fragility margins for a given PWA control law and a linear discrete-time system. More precisely, the robustness margin is defined as the set of linear time-varying systems for which the given PWA control law keeps the trajectories inside a given feasible set. From a different perspective, the fragility margin contains all the admissible variations of the control law coefficients such that the positive invariance of the given feasible set is still guaranteed. It is shown that if the given feasible set is a polytope, then so are these robustness/fragility margins. The second part of this thesis focuses on the inverse optimality problem for the class of PWA controllers. Namely, the goal is to construct an optimization problem whose optimal solution is equivalent to the given PWA function. The methodology is based on convex lifting: an auxiliary one-dimensional variable which embeds a convexity characterization into the recovered optimization problem. Accordingly, if the given PWA function is continuous, the optimal solution to this reconstructed optimization problem is shown to be unique. Otherwise, if the continuity of the given PWA function is not fulfilled, this function is shown to be one optimal solution among others to the recovered problem. In view of applications in linear model predictive control (MPC), it is shown that any continuous PWA control law can be obtained from a linear MPC problem with a prediction horizon of at most 2 steps. Aside from its theoretical meaning, this result also helps facilitate the implementation of PWA control laws by avoiding the storage of the state-space partition. Another use of convex liftings, shown in the last part of this thesis, is as a control Lyapunov function. Accordingly, convex liftings are deployed in a robust control design for linear systems affected by bounded additive disturbances and polytopic uncertainties. Both implicit and explicit controllers can be obtained, and the method guarantees recursive feasibility and robust stability. However, this control Lyapunov function is only defined over the maximal λ-contractive set for a given 0 ≤ λ < 1, which is known to be smaller than the maximal controllable set. Therefore, an extension of the above method to the N-step controllable set is presented. This method is based on a cascade of convex liftings, where an auxiliary variable is used to emulate a Lyapunov function: this variable is shown to be non-negative, to strictly decrease for the first N steps, and to stay at 0 afterwards. Accordingly, robust stability is guaranteed.
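To make concrete why the abstract emphasises avoiding storage of the state-space partition: an explicit PWA control law is usually evaluated by locating the polyhedral region that contains the current state, as in the illustrative (not thesis-specific) sketch below, where the region tuples and the toy example are made up.

import numpy as np

def pwa_control(x, regions):
    """Evaluate a piecewise affine control law u = F_i x + g_i.

    `regions` is a list of tuples (H_i, h_i, F_i, g_i); region i is the
    polyhedron {x : H_i x <= h_i}.  Storing all (H_i, h_i) is exactly the
    partition-storage burden discussed above.
    """
    for H, h, F, g in regions:
        if np.all(H @ x <= h + 1e-9):
            return F @ x + g
    raise ValueError("x lies outside the feasible set of the PWA law")

# Toy scalar example: u = -x on [-1, 0], u = -2x on [0, 1]
regions = [
    (np.array([[1.0], [-1.0]]), np.array([0.0, 1.0]), np.array([[-1.0]]), np.zeros(1)),
    (np.array([[-1.0], [1.0]]), np.array([0.0, 1.0]), np.array([[-2.0]]), np.zeros(1)),
]
print(pwa_control(np.array([0.5]), regions))   # -> [-1.]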
APA, Harvard, Vancouver, ISO, and other styles
7

Nicotra, Marco. "Constrained Control of Nonlinear Systems: The Explicit Reference Governor and its Application to Unmanned Aerial Vehicles." Doctoral thesis, Universite Libre de Bruxelles, 2016. http://hdl.handle.net/2013/ULB-DIPOT:oai:dipot.ulb.ac.be:2013/235608.

Full text
Abstract:
This dissertation introduces the Explicit Reference Governor: a simple and systematic add-on control unit that provides constraint handling capabilities to any pre-stabilized nonlinear system by suitably manipulating its applied reference. The main innovation of the proposed framework is that constraint satisfaction is ensured without having to solve implicit equations. As a result, the Explicit Reference Governor is particularly well suited for applications with limited computational capabilities. The basic idea behind the scheme consists in manipulating the derivative of the applied reference so that, at any given time instant, the currently applied reference will not cause a violation of constraints anytime in the future. The theory behind the proposed framework is presented in general terms and is then detailed to provide specific design strategies. Possible extensions to ensure robustness are also proposed. In addition to introducing the general theory of the Explicit Reference Governor, the dissertation illustrates its step-by-step implementation on Unmanned Aerial Vehicles.
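The general structure described above can be written down compactly; the sketch below is a schematic Euler discretisation with a hypothetical Lyapunov-level-set safety margin, not the dissertation's specific design, and `V`, `Gamma` and `kappa` are placeholders the reader would replace by the ingredients of a concrete pre-stabilised system.

import numpy as np

def erg_step(v, x, r, V, Gamma, dt, kappa=1.0, eta=1e-6):
    """One Euler step of an explicit-reference-governor style update.

    v       : currently applied reference
    x       : current state of the pre-stabilised system
    r       : desired reference
    V(x, v) : Lyapunov-like function of the closed loop around the reference v
    Gamma(v): largest level of V that keeps the constraint set invariant at v
    The applied reference moves toward r at a rate scaled by the dynamic safety
    margin Delta = Gamma(v) - V(x, v), so it slows down (or stops) as the state
    approaches the constraint boundary; no implicit equation is solved online.
    """
    delta = max(Gamma(v) - V(x, v), 0.0)              # dynamic safety margin
    rho = (r - v) / max(np.linalg.norm(r - v), eta)   # attraction (navigation) field
    return v + dt * kappa * delta * rho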
Doctorate in Engineering Sciences and Technology
APA, Harvard, Vancouver, ISO, and other styles
8

Polowinski, Jan. "Ontology-Driven, Guided Visualisation Supporting Explicit and Composable Mappings." Doctoral thesis, Saechsische Landesbibliothek- Staats- und Universitaetsbibliothek Dresden, 2017. http://nbn-resolving.de/urn:nbn:de:bsz:14-qucosa-229908.

Full text
Abstract:
Data masses on the World Wide Web can hardly be managed by humans or machines. One option is the formal description and linking of data sources using Semantic Web and Linked Data technologies. Ontologies written in standardised languages foster the sharing and linking of data as they provide a means to formally define concepts and relations between these concepts. A second option is visualisation. The visual representation allows humans to perceive information more directly, using the highly developed visual sense. Relatively few efforts have been made on combining both options, although the formality and rich semantics of ontological data make it an ideal candidate for visualisation. Advanced visualisation design systems support the visualisation of tabular, typically statistical data. However, visualisations of ontological data still have to be created manually, since automated solutions are often limited to generic lists or node-link diagrams. Also, the semantics of ontological data are not exploited for guiding users through visualisation tasks. Finally, once a good visualisation setting has been created, it cannot easily be reused and shared. Trying to tackle these problems, we had to answer how to define composable and shareable mappings from ontological data to visual means and how to guide the visual mapping of ontological data. We present an approach that allows for the guided visualisation of ontological data, the creation of effective graphics and the reuse of visualisation settings. Instead of generic graphics, we aim at tailor-made graphics, produced using the whole palette of visual means in a flexible, bottom-up approach. It not only allows for visualising ontologies, but uses ontologies to guide users when visualising data and to drive the visualisation process at various places: First, as a rich source of information on data characteristics, second, as a means to formally describe the vocabulary for building abstract graphics, and third, as a knowledge base of facts on visualisation. This is why we call our approach ontology-driven. We suggest generating an Abstract Visual Model (AVM) to represent and »synthesise« a graphic following a role-based approach, inspired by the one used by J. v. Engelhardt for the analysis of graphics. It consists of graphic objects and relations formalised in the Visualisation Ontology (VISO). A mappings model, based on the declarative RDFS/OWL Visualisation Language (RVL), determines a set of transformations from the domain data to the AVM. RVL allows for composable visual mappings that can be shared and reused across platforms. To guide the user, for example, we discourage the construction of mappings that are suboptimal according to an effectiveness ranking formalised in the fact base and suggest more effective mappings instead. The guidance process is flexible, since it is based on exchangeable rules. VISO, RVL and the AVM are additional contributions of this thesis. Further, we initially analysed the state of the art in visualisation and RDF-presentation comparing 10 approaches by 29 criteria. Our approach is unique because it combines ontology-driven guidance with composable visual mappings. Finally, we compare three prototypes covering the essential parts of our approach to show its feasibility. We show how the mapping process can be supported by tools displaying warning messages for non-optimal visual mappings, e.g., by considering relation characteristics such as »symmetry«. 
In a constructive evaluation, we challenge both the RVL language and the latest prototype trying to regenerate sketches of graphics we created manually during analysis. We demonstrate how graphics can be varied and complex mappings can be composed from simple ones. Two thirds of the sketches can be almost or completely specified and half of them can be almost or completely implemented.
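As a toy, language-agnostic stand-in for the idea of a declarative visual mapping with effectiveness-based guidance (it does not use RVL, VISO or the AVM, and the ranking list and function name are invented for illustration), the sketch below maps a quantitative data variable onto one visual channel and warns when a more effective channel is still available, following the ranking idea mentioned in the abstract.

import warnings

# A hypothetical effectiveness ranking for quantitative data, best channel first.
EFFECTIVENESS = ["position", "length", "angle", "area", "colour_saturation", "colour_hue"]

def map_variable(values, channel, channel_range=(0.0, 1.0)):
    """Map a quantitative variable onto one visual channel, with guidance.

    Emits a warning if a more effective channel (per the ranking above) could
    have been chosen instead; then rescales the values into channel_range.
    """
    if channel not in EFFECTIVENESS:
        raise ValueError(f"unknown channel: {channel}")
    better = EFFECTIVENESS[:EFFECTIVENESS.index(channel)]
    if better:
        warnings.warn(f"'{channel}' is less effective than {better} for quantitative data")
    lo, hi = min(values), max(values)
    span = (hi - lo) or 1.0
    a, b = channel_range
    return {v: a + (b - a) * (v - lo) / span for v in values}

print(map_variable([3, 7, 12], "colour_hue"))   # warns, then returns a value -> hue mapping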
APA, Harvard, Vancouver, ISO, and other styles
9

Goldar, Davila Alejandro. "Low-complexity algorithms for the fast and safe charge of Li-ion batteries." Doctoral thesis, Universite Libre de Bruxelles, 2021. http://hdl.handle.net/2013/ULB-DIPOT:oai:dipot.ulb.ac.be:2013/320077.

Full text
Abstract:
This thesis proposes, validates, and compares low-complexity algorithms for the fast and safe charge and balance of Li-ion batteries, both for the single-cell case and for the case of a serially-connected string of battery cells. The proposed algorithms are based on a reduced-order electrochemical model (Equivalent Hydraulic Model, EHM) and make use of constrained-control strategies to limit the main electrochemical degradation phenomena that may accelerate aging, namely lithium plating in the anode and solvent oxidation in the cathode. To avoid the computational intensiveness of solving an online optimization as in the Model Predictive Control (MPC) framework, this thesis proposes the use of Reference Governor schemes. Variants of both the Scalar Reference Governor (SRG) and the Explicit Reference Governor (ERG) are developed to deal with the non-convex admissible region for the charge of a battery cell, while keeping a low computational burden. To evaluate the performance of the proposed techniques for the single-cell case, they are experimentally validated on commercial Turnigy LCO cells of 160 mAh at four different constant temperatures (10, 20, 30 and 40 °C). In the second part of this thesis, the proposed charging strategies are extended to take into account the balance of a serially-connected string of cells. To equalize possible mismatches, a centralized policy based on a shunting grid (active balancing) connects or disconnects the cells during the charge. After a preliminary analysis, a simple mixed-integer algorithm was proposed. Since this method is computationally inefficient due to the high number of scenarios to be evaluated, this thesis proposes a ratio-based algorithm built on a Pulse-Width Modulation (PWM) approach. This approach can be used within both MPC and RG schemes. The numerical validations of the proposed algorithms for the case of a string of four battery cells are carried out using a simulator based on a full-order electrochemical model. These validations show that the PWM-like approach charges all the cells within the pack in parallel, whereas the mixed-integer approach charges the battery cells sequentially, from the cell with the lowest state of charge to the ones with the highest. On the basis of the simulations, an algorithm based on mixed logic that allows charging in a “sequential-parallel” fashion is proposed. Some conclusions and future directions of research are given at the end of the thesis.
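As a rough, non-authoritative sketch of the scalar-reference-governor idea mentioned above (the reduced-order EHM dynamics and the actual electrochemical constraints of the thesis are not reproduced; `predict` and `constraints_ok` are placeholder callables):

import numpy as np

def srg_update(x, v_prev, r, predict, constraints_ok, horizon=50, grid=21):
    """Scalar reference governor: pick the largest admissible step toward r.

    The applied reference is v = v_prev + kappa*(r - v_prev); kappa in [0, 1]
    is chosen as the largest value on a grid for which the predicted state
    trajectory under the constant reference v stays admissible.
    predict(x, v, horizon) -> iterable of predicted states
    constraints_ok(state)  -> True if all (e.g. aging-related) constraints hold
    """
    for kappa in np.linspace(1.0, 0.0, grid):
        v = v_prev + kappa * (r - v_prev)
        if all(constraints_ok(s) for s in predict(x, v, horizon)):
            return v
    return v_prev   # keep the previously admissible reference if no step is safe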
Doctorate in Engineering Sciences and Technology
APA, Harvard, Vancouver, ISO, and other styles
10

Nguyen, Thanh Dat. "Aide à la validation temporelle et au dimensionnement de systèmes temps réels dans une démarche dirigée par modèles / PARAD Repository: On the Capitalization of the Performance Analysis Process for AADL Designs / Design and analysis of semaphore precedence constraints / Towards a model-based framework for prototyping performance analysis tests / Towards a Descriptive Language to Explicitly Define the Applicability of Timing Verification Tests of Critical Real-Time Systems." Thesis, Chasseneuil-du-Poitou, Ecole nationale supérieure de mécanique et d'aérotechnique, 2020. http://www.theses.fr/2020ESMA0007/document.

Full text
Abstract:
Real-time embedded systems are increasingly omnipresent in everyday life. The development cycle of critical systems can take months or even years. Therefore, models of these systems should be analyzed at an early stage of the development cycle to verify that all requirements are met, including temporal requirements (e.g., latencies, end-to-end delays). This thesis, which was funded as part of an FUI project, offers three major contributions. The first contribution relates to mono-processor task systems with deterministic multi-periodic communication relationships. A pattern based on Semaphore Precedence Constraints (SPC) has been extended to support cycles in the case of dynamic-priority scheduling. An unfolding graph has also been proposed in order to represent the cyclicity of SPC-based systems and to guarantee deadlock freedom. The extended SPC pattern and the corresponding schedulability tests have been implemented for the standard AADL language. The second contribution of this thesis consists in an exact calculation of end-to-end response times, using the time Petri net formalism, for identical multiprocessor systems. It takes into account the dependencies between tasks: precedence and mutual exclusion. The third contribution concerns the capitalization of the efforts of the analysis process. Indeed, many analysis tests have been proposed, notably by academic researchers, based on scheduling theory and dedicated to different software and hardware architectures. However, one of the main difficulties encountered by designers is to choose the most appropriate analysis test to validate and/or correctly dimension their designs. In industrial settings, few analysis tests are used despite the multitude of tests on offer. This thesis therefore aims to facilitate the analysis process, reduce the semantic gap between the business model and the inputs of the analysis tests, and accelerate the technology transfer and adoption of academic tests. The thesis proposes an analysis repository playing the role of a dictionary of tests, their contexts for correct use, and the tools implementing them, as well as a mechanism for choosing tests according to the input business model.
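To give a concrete flavour of what an "analysis test" in such a repository looks like, here is one classical uniprocessor schedulability test (fixed-priority response-time analysis), shown purely as an illustration; it is not one of the thesis's contributions, whose tests target SPC-based and multiprocessor systems.

import math

def response_time(tasks):
    """Worst-case response-time analysis for fixed-priority, preemptive,
    independent periodic tasks on one processor.

    tasks: list of (C, T, D) sorted by decreasing priority (C = WCET,
    T = period, D = relative deadline).  Returns the response times, or None
    for a task that misses its deadline.
    """
    results = []
    for i, (c_i, t_i, d_i) in enumerate(tasks):
        r = c_i
        while True:
            interference = sum(math.ceil(r / t_j) * c_j for c_j, t_j, _ in tasks[:i])
            r_next = c_i + interference
            if r_next > d_i:
                results.append(None)        # unschedulable at this priority level
                break
            if r_next == r:
                results.append(r)           # fixed point reached
                break
            r = r_next
    return results

print(response_time([(1, 4, 4), (2, 6, 6), (3, 12, 12)]))   # -> [1, 3, 10]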
APA, Harvard, Vancouver, ISO, and other styles
11

Zheng, Haoyang. "Quantifying implicit and explicit constraints on physics-informed neural processes." Thesis, 2021.

Find full text
Abstract:

Due to strong interactions among the various phases, and between the phases and the fluid motion, multiphase flows (MPFs) are so complex that considerable effort is required to predict the sequential patterns of their phases and motions. This thesis takes the physical constraints inherent in MPFs and enforces them in a physics-informed neural network (PINN) model either explicitly or implicitly, depending on the type of constraint. To predict the unobserved order parameters (OPs), which locate the phases, in future steps, conditional neural processes (CNPs) combined with long short-term memory (LSTM), referred to as CNP-LSTM, are applied to quickly infer the dynamics of the phases after encoding only a few observations. After that, the multiphase consistent and conservative boundedness mapping (MCBOM) algorithm is applied to correct the OPs predicted by CNP-LSTM so that mass conservation, the summation of the volume fractions of the phases being unity, the consistency of reduction, and the boundedness of the OPs are strictly satisfied. Next, the density of the fluid mixture is computed from the corrected OPs. The observed velocity and density of the fluid mixture are then encoded in a physics-informed conditional neural process with long short-term memory (PICNP-LSTM), where the constraint of momentum conservation is included in the loss function. Finally, the unobserved velocity in future steps is predicted from PICNP-LSTM. The proposed physics-informed neural processes (PINP) model (CNP-LSTM-MCBOM-PICNP-LSTM) for MPFs avoids unphysical behavior of the OPs, accelerates convergence, and requires less data. The proposed model successfully predicts several canonical MPF problems, i.e., the horizontal shear layer (HSL) and dam break (DB) problems, and its performance is validated.
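As a deliberately crude stand-in for the correction step described above (the actual MCBOM algorithm is more involved and also enforces mass conservation and consistency of reduction), the snippet below shows only the two simplest constraints, under the assumption that the order parameters are volume fractions in [0, 1] that must sum to one at every grid point; the function name is invented for illustration.

import numpy as np

def naive_boundedness_fix(op, eps=1e-12):
    """Clip predicted order parameters to [0, 1] and renormalise each row.

    op: array of shape (n_points, n_phases) holding predicted volume fractions.
    This covers only the boundedness and unit-sum constraints listed in the
    abstract, not the full consistent-and-conservative MCBOM correction.
    """
    op = np.clip(op, 0.0, 1.0)
    return op / np.maximum(op.sum(axis=1, keepdims=True), eps)

print(naive_boundedness_fix(np.array([[1.1, -0.2, 0.3]])))   # -> [[0.769.. 0. 0.230..]]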

APA, Harvard, Vancouver, ISO, and other styles
12

"Quadratic Optimization with Orthogonality Constraints: Explicit Lojasiewicz Exponent and Linear Convergence." 2016. http://repository.lib.cuhk.edu.hk/en/item/cuhk-1292697.

Full text
APA, Harvard, Vancouver, ISO, and other styles
13

Surowiec, Thomas Michael [Verfasser]. "Explicit stationarity conditions and solution characterization for equilibrium problems with equilibrium constraints / von Thomas Michael Surowiec." 2010. http://d-nb.info/1004358318/34.

Full text
APA, Harvard, Vancouver, ISO, and other styles
14

Cho, Donghyurn. "Successive Backward Sweep Methods for Optimal Control of Nonlinear Systems with Constraints." Thesis, 2013. http://hdl.handle.net/1969.1/150960.

Full text
Abstract:
Continuous and discrete-time Successive Backward Sweep (SBS) methods for solving nonlinear optimal control problems involving terminal and control constraints are proposed in this dissertation. They closely resemble the Neighboring Extremals and Differential Dynamic Programming algorithms, which are based on the successive solution of a series of linear control problems with quadratic performance indices. The SBS methods are relatively insensitive to the initial guesses of the state and control histories, which are not required to satisfy the system dynamics. Hessian modifications are utilized, especially for non-convex problems, to avoid singularities during the backward integration of the gain equations. The SBS method requires the satisfaction of the Jacobi no-conjugate-point condition and hence produces optimal solutions. The standard implementation of the SBS method for continuous-time systems incurs terminal boundary condition errors due to an algorithmic singularity as well as numerical inaccuracies in the computation of the gain matrices. Alternatives for boundary error reduction are proposed, notably the aiming point and the switching between two forms of the sweep expansion formulae. Modification of the sweep formula expands the domain of convergence of the SBS method and allows for rigorous testing for the existence of conjugate points. The numerical accuracy of the continuous-time formulation of the optimal control problem can be improved with the use of symplectic integrators, which are generally implicit schemes in time. A time-explicit group-preserving method based on the Magnus series representation of the state transition is implemented in the SBS setting and is shown to outperform a non-symplectic integrator of the same order. Discrete-time formulations of the optimal control problem, directly accounting for a specific time-stepping method, lead to consistent systems of equations whose solutions satisfy the boundary conditions of the discretized problem accurately. In this regard, the second-order, implicit mid-point averaging scheme, a symplectic integrator, is adapted for use with the SBS method. The performance of the mid-point averaging scheme is compared with non-symplectic schemes of equal and higher order to show its advantages. The SBS method is augmented with a homotopy-continuation procedure to isolate and regulate certain nonlinear effects for difficult problems, in order to extend its domain of convergence. The discrete-time SBS method is also extended to solve problems where the controls are approximated to be impulsive and to handle waypoint constraints as well. A variety of highly nonlinear optimal control problems involving orbit transfer, atmospheric reentry, and the restricted three-body problem are treated to demonstrate the performance of the methods developed in this dissertation.
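For orientation only, the core linear-quadratic backward sweep that methods of this family build on (without the constraint handling, Hessian modifications, or symplectic integration discussed in the abstract) can be sketched as follows, assuming a discrete-time problem x_{k+1} = A x_k + B u_k with stage cost x'Qx + u'Ru and terminal cost x'Qf x; the example matrices are illustrative.

import numpy as np

def backward_sweep(A, B, Q, R, Qf, N):
    """Backward Riccati sweep for a finite-horizon discrete-time LQ problem.

    Returns the time-varying feedback gains K[0..N-1] such that u_k = -K[k] x_k
    minimises sum_k (x'Qx + u'Ru) + x_N' Qf x_N.
    """
    P = Qf
    gains = [None] * N
    for k in range(N - 1, -1, -1):                    # sweep backwards in time
        K = np.linalg.solve(R + B.T @ P @ B, B.T @ P @ A)
        P = Q + A.T @ P @ (A - B @ K)                 # Riccati recursion
        gains[k] = K
    return gains

# Double-integrator example (illustrative data only)
A = np.array([[1.0, 0.1], [0.0, 1.0]])
B = np.array([[0.0], [0.1]])
gains = backward_sweep(A, B, np.eye(2), np.eye(1), 10 * np.eye(2), N=30)
print(gains[0])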
APA, Harvard, Vancouver, ISO, and other styles
15

Tu, Xi. "Image representation with explicit discontinuities using triangle meshes." Thesis, 2012. http://hdl.handle.net/1828/4264.

Full text
Abstract:
Triangle meshes can provide an effective geometric representation of images. Although many mesh-generation methods have been proposed to date, many of them do not explicitly take image discontinuities into consideration. In this thesis, a new mesh model for images, which explicitly represents discontinuities (i.e., image edges), is proposed along with two corresponding mesh-generation methods that determine the mesh-model parameters for a given input image. The mesh model is based on constrained Delaunay triangulations (DTs), where the constrained edges correspond to image edges. One of the proposed methods, named explicitly-represented discontinuities with error diffusion (ERDED), is fast and easy to implement. In the ERDED method, the error diffusion (ED) scheme is employed to select a subset of sample points that are not on the constrained edges. The other proposed method is called ERDGPI. In the ERDGPI method, a constrained DT is first constructed with a set of prespecified constrained edges. Then, the greedy point insertion (GPI) scheme is employed to insert one point into the constrained DT in each iteration until a certain number of points is reached. The ERDED and ERDGPI methods involve several parameters which must be provided as input. These parameters can affect the quality of the resulting image approximations and are discussed in detail. We also evaluate the performance of our proposed ERDED and ERDGPI methods by comparing them with the highly effective ED and GPI schemes. Our proposed methods are demonstrated to be capable of producing image approximations of higher quality, both in terms of PSNR and subjective quality, than those generated by the other schemes. For example, the reconstructed images produced by the proposed ERDED method are often about 3.77 dB higher in PSNR than those produced by the ED scheme, and our proposed ERDGPI scheme produces image approximations of about 1.08 dB higher PSNR than those generated by the GPI approach.
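A much-simplified, hedged sketch of the sample-point selection step (Floyd-Steinberg error diffusion applied to a sampling-density map) followed by an ordinary Delaunay triangulation is shown below; the density map here is synthetic, and note that SciPy provides only unconstrained Delaunay triangulations, whereas the thesis's methods additionally constrain edges to detected image discontinuities.

import numpy as np
from scipy.spatial import Delaunay

def error_diffusion_samples(density):
    """Pick sample points by Floyd-Steinberg error diffusion of a density map.

    density: 2-D array with values in [0, 1]; larger values mean a higher
    chance that the pixel becomes a mesh sample point.
    """
    d = density.astype(float).copy()
    h, w = d.shape
    points = []
    for y in range(h):
        for x in range(w):
            out = 1.0 if d[y, x] >= 0.5 else 0.0
            if out:
                points.append((x, y))
            err = d[y, x] - out
            # Distribute the quantisation error to the unvisited neighbours.
            if x + 1 < w:               d[y, x + 1]     += err * 7 / 16
            if y + 1 < h and x > 0:     d[y + 1, x - 1] += err * 3 / 16
            if y + 1 < h:               d[y + 1, x]     += err * 5 / 16
            if y + 1 < h and x + 1 < w: d[y + 1, x + 1] += err * 1 / 16
    return np.array(points)

# Toy density map; in an ERDED-style pipeline this would be derived from the image.
density = np.random.default_rng(0).random((32, 32)) * 0.2
pts = error_diffusion_samples(density)
tri = Delaunay(pts)                     # unconstrained DT of the selected points
print(len(pts), "sample points,", len(tri.simplices), "triangles")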
APA, Harvard, Vancouver, ISO, and other styles
16

Ho, Raymond J. "Development and implementation of nonlinear constraint relations for explicit finite element analysis." 2004. http://link.library.utoronto.ca/eir/EIRdetail.cfm?Resources__ID=81112&T=F.

Full text
APA, Harvard, Vancouver, ISO, and other styles
17

Cheng, Yu-Ru. "Explicit Compositional State-Space Enumeration with Context Constraint & Counter Examples by Hierarchical Tracing." 2004. http://www.cetd.com.tw/ec/thesisdetail.aspx?etdun=U0021-2004200710044448.

Full text
APA, Harvard, Vancouver, ISO, and other styles
18

Polowinski, Jan. "Ontology-Driven, Guided Visualisation Supporting Explicit and Composable Mappings." Doctoral thesis, 2016. https://tud.qucosa.de/id/qucosa%3A30593.

Full text
Abstract:
Data masses on the World Wide Web can hardly be managed by humans or machines. One option is the formal description and linking of data sources using Semantic Web and Linked Data technologies. Ontologies written in standardised languages foster the sharing and linking of data as they provide a means to formally define concepts and relations between these concepts. A second option is visualisation. The visual representation allows humans to perceive information more directly, using the highly developed visual sense. Relatively few efforts have been made on combining both options, although the formality and rich semantics of ontological data make it an ideal candidate for visualisation. Advanced visualisation design systems support the visualisation of tabular, typically statistical data. However, visualisations of ontological data still have to be created manually, since automated solutions are often limited to generic lists or node-link diagrams. Also, the semantics of ontological data are not exploited for guiding users through visualisation tasks. Finally, once a good visualisation setting has been created, it cannot easily be reused and shared. Trying to tackle these problems, we had to answer how to define composable and shareable mappings from ontological data to visual means and how to guide the visual mapping of ontological data. We present an approach that allows for the guided visualisation of ontological data, the creation of effective graphics and the reuse of visualisation settings. Instead of generic graphics, we aim at tailor-made graphics, produced using the whole palette of visual means in a flexible, bottom-up approach. It not only allows for visualising ontologies, but uses ontologies to guide users when visualising data and to drive the visualisation process at various places: First, as a rich source of information on data characteristics, second, as a means to formally describe the vocabulary for building abstract graphics, and third, as a knowledge base of facts on visualisation. This is why we call our approach ontology-driven. We suggest generating an Abstract Visual Model (AVM) to represent and »synthesise« a graphic following a role-based approach, inspired by the one used by J. v. Engelhardt for the analysis of graphics. It consists of graphic objects and relations formalised in the Visualisation Ontology (VISO). A mappings model, based on the declarative RDFS/OWL Visualisation Language (RVL), determines a set of transformations from the domain data to the AVM. RVL allows for composable visual mappings that can be shared and reused across platforms. To guide the user, for example, we discourage the construction of mappings that are suboptimal according to an effectiveness ranking formalised in the fact base and suggest more effective mappings instead. The guidance process is flexible, since it is based on exchangeable rules. VISO, RVL and the AVM are additional contributions of this thesis. Further, we initially analysed the state of the art in visualisation and RDF-presentation comparing 10 approaches by 29 criteria. Our approach is unique because it combines ontology-driven guidance with composable visual mappings. Finally, we compare three prototypes covering the essential parts of our approach to show its feasibility. We show how the mapping process can be supported by tools displaying warning messages for non-optimal visual mappings, e.g., by considering relation characteristics such as »symmetry«. 
In a constructive evaluation, we challenge both the RVL language and the latest prototype trying to regenerate sketches of graphics we created manually during analysis. We demonstrate how graphics can be varied and complex mappings can be composed from simple ones. Two thirds of the sketches can be almost or completely specified and half of them can be almost or completely implemented.
The masses of data on the World Wide Web can hardly be grasped by humans or machines. One option is to formally describe and interlink data sources using Semantic Web and Linked Data technologies. Ontologies, written in standardised languages, promote the sharing and linking of data, since they provide a means to formally define concepts and the relationships between them. A second option is visualisation. Visual representation allows humans to perceive information more directly, using their highly developed sense of sight. Relatively little effort has been made to combine both options, although their formality and rich semantics make ontological data an ideal candidate for visualisation. Visualisation design systems support users in visualising tabular, typically statistical, data. Visualisations of ontological data, however, still have to be created manually, since automated solutions are often limited to generic list views or node-link diagrams. Nor is the semantics of the ontological data exploited to guide users through visualisation tasks. Visualisation settings, once created, cannot easily be reused and shared. To solve these problems, we had to find an answer to the question of how composable and reusable mappings from ontological data to visual means could be defined, and how users could be guided in this mapping. We present an approach that enables the guided visualisation of ontological data, the creation of effective graphics, and the reuse of visualisation settings. Rather than generic graphics, the approach targets tailor-made graphics built with the full range of visual means in a flexible bottom-up fashion. It not only allows ontologies to be visualised, but also uses ontologies to guide users in visualising data and to drive the visualisation process at several points: first, as a rich source of information on data characteristics; second, as a means to formally describe the vocabulary for constructing abstract graphics; and third, as a knowledge base of visualisation facts. This is why we call our approach ontology-driven. We propose generating an Abstract Visual Model (AVM) to synthesise a graphic in a role-based manner, following an approach used by J. von Engelhardt for analysing graphics. The AVM consists of graphic objects and relations formalised in the Visualisation Ontology (VISO). A mapping model based on the declarative RDFS/OWL Visualisation Language (RVL) defines a set of transformations from the source data to the AVM. RVL enables composable mappings, i.e. visual mappings that can be shared and reused across platforms. To guide the user, we rate mappings based on an effectiveness ranking formalised in the fact base and suggest more effective mappings where applicable. The guidance process is flexible, since it is based on exchangeable rules. VISO, RVL and the AVM are further contributions of this thesis.
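To make the guidance mechanism described above a little more concrete, the following minimal Python sketch shows how an effectiveness ranking could drive suggestions of visual channels for data variables. The ranking table, channel names and function are illustrative assumptions only and do not reproduce the actual VISO fact base or RVL constructs.

```python
# Minimal sketch (assumed, simplified): suggest graphic relations ("visual channels")
# for data variables based on an effectiveness ranking per scale of measurement,
# loosely inspired by the guidance mechanism described in the abstract.

# Hypothetical effectiveness ranking: most effective channel first, per scale.
EFFECTIVENESS_RANKING = {
    "quantitative": ["position", "length", "color_value", "size"],
    "ordinal":      ["position", "color_value", "size", "shape"],
    "nominal":      ["position", "color_hue", "shape", "texture"],
}

def suggest_channels(variables, used_channels=None):
    """Propose the most effective, still unused channel for each data variable.

    `variables` maps a variable name to its scale of measurement,
    e.g. {"population": "quantitative", "continent": "nominal"}.
    """
    used = set(used_channels or [])
    suggestions = {}
    for name, scale in variables.items():
        ranked = EFFECTIVENESS_RANKING.get(scale, [])
        # Pick the highest-ranked channel not yet taken by another mapping.
        channel = next((c for c in ranked if c not in used), None)
        if channel is None:
            raise ValueError(f"No free channel left for variable '{name}'")
        used.add(channel)
        suggestions[name] = channel
    return suggestions

if __name__ == "__main__":
    data = {"population": "quantitative", "continent": "nominal"}
    print(suggest_channels(data))
    # e.g. {'population': 'position', 'continent': 'color_hue'}
```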
Beyond this, we first analyse the state of the art in visualisation and RDF presentation by comparing 10 approaches against 29 criteria. Our approach is unique in that it combines ontology-driven user guidance with composable visual mappings. Finally, we compare three prototypes implementing the essential parts of our approach in order to demonstrate its feasibility. We show how the mapping process can be supported by tools that display warnings for non-optimal visual mappings, for example by taking into account characteristics of relations such as symmetry. In a constructive evaluation, we challenge both the RVL language and the latest prototype by attempting to realise sketches of graphics that we created manually during the analysis. We show how graphics can be varied and how complex mappings can be composed from simple ones. Two thirds of the sketches can be specified almost completely or completely, and half can be realised almost completely or completely.
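As a rough, hypothetical illustration of the symmetry warnings mentioned above, the following Python sketch checks a single visual mapping against invented characteristic flags and graphic relation names; it is not the rule set implemented in the prototypes.

```python
# Minimal sketch (assumed names): warn when a characteristic of the mapped data
# relation conflicts with the chosen graphic relation, e.g. a symmetric relation
# rendered with a directed arrow.

# Hypothetical catalogue: which graphic relations imply a reading direction.
DIRECTED_GRAPHIC_RELATIONS = {"arrow", "containment"}
UNDIRECTED_GRAPHIC_RELATIONS = {"undirected_line", "proximity", "same_color"}

def check_mapping(data_relation, graphic_relation):
    """Return a list of warning strings for a single visual mapping.

    `data_relation` is a dict with characteristic flags,
    e.g. {"name": "marriedTo", "symmetric": True}.
    """
    warnings = []
    if data_relation.get("symmetric") and graphic_relation in DIRECTED_GRAPHIC_RELATIONS:
        warnings.append(
            f"Relation '{data_relation['name']}' is symmetric, but "
            f"'{graphic_relation}' suggests a direction; consider an undirected "
            f"graphic relation such as {sorted(UNDIRECTED_GRAPHIC_RELATIONS)}."
        )
    if not data_relation.get("symmetric") and graphic_relation in UNDIRECTED_GRAPHIC_RELATIONS:
        warnings.append(
            f"Relation '{data_relation['name']}' is directed, but "
            f"'{graphic_relation}' does not show a direction."
        )
    return warnings

if __name__ == "__main__":
    for w in check_mapping({"name": "marriedTo", "symmetric": True}, "arrow"):
        print("WARNING:", w)
```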
