
Dissertations / Theses on the topic 'Fusion Management System'


Consult the top 26 dissertations / theses for your research on the topic 'Fusion Management System.'


1

Deaves, R. H. "The management of communications in decentralised Bayesian data fusion system." Thesis, University of Bristol, 1998. http://hdl.handle.net/1983/ae6cb4d5-96e8-4af4-90c0-fddb4f188369.

2

Phiri, Jackson. "A digital identity management system." Thesis, UWC, 2007. http://hdl.handle.net/11394/2871.

Abstract:
Magister Scientiae - MSc
Recent years have seen an increase in the number of users accessing online services using communication devices such as computers and mobile phones, and card-based credentials such as credit cards. This has prompted most governments and business organizations to change the way they do business and manage their identity information. The advent of online services has, however, made most Internet users vulnerable to identity fraud and theft, resulting in a growing number of reported cases that cost the global industry excessive amounts. Today, with more powerful and effective technologies such as artificial intelligence, wireless communication, mobile storage devices and biometrics, it should be possible to build a more effective multi-modal authentication system to help reduce the cases of identity fraud and theft. A multi-modal digital identity management system is proposed as a solution for managing digital identity information in an effort to reduce the cases of identity fraud and theft seen in most online services today. The proposed system uses technologies such as artificial intelligence and biometrics over current unsecured networks to maintain the security and privacy of users and service providers in a transparent, reliable and efficient way. In order to be authenticated in the proposed multi-modal authentication system, a user is required to submit more than one credential attribute. Artificial intelligence is used to implement an information fusion technique that combines the user's credential attributes for optimum recognition. The information fusion engine is then used to implement the required multi-modal authentication system.
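The thesis does not reproduce its fusion engine here, but the general idea of score-level fusion of multiple credential attributes can be sketched as a weighted sum of normalised per-modality match scores. The modalities, weights and acceptance threshold below are illustrative assumptions, not values from the thesis:

```python
def fuse_scores(scores, weights, threshold=0.7):
    """Weighted-sum (score-level) fusion of per-modality match scores.

    scores  -- dict mapping modality name to a match score in [0, 1]
    weights -- dict mapping the same modality names to relative weights
    Returns the fused score and the accept/reject decision.
    """
    if set(scores) != set(weights):
        raise ValueError("scores and weights must cover the same modalities")
    total = sum(weights.values())
    fused = sum(scores[m] * weights[m] for m in scores) / total
    return fused, fused >= threshold

# Hypothetical example: strong password match, good fingerprint, mediocre face.
fused, accepted = fuse_scores(
    {"password": 0.9, "fingerprint": 0.8, "face": 0.6},
    {"password": 1.0, "fingerprint": 2.0, "face": 1.0},
)
```

Requiring more than one credential attribute means a single weak score (e.g. a guessed password) is not enough on its own to clear the threshold.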
3

Mirza, Atif R. "An architectural selection framework for data fusion in sensor platforms." Thesis, Massachusetts Institute of Technology, 2006. http://hdl.handle.net/1721.1/42369.

Abstract:
Thesis (S.M.)--Massachusetts Institute of Technology, System Design and Management Program, February 2007.
Includes bibliographical references (leaves 97-100).
The role of data fusion in sensor platforms is becoming increasingly important in various domains of science, technology and business. Fusion pertains to the merging or integration of information towards an enhanced level of awareness. This thesis provides a canonical overview of several major fusion architectures developed within the remote sensing and defense communities. Additionally, it provides an assessment of current sensors and their platforms, the influence of reliability measures, and the connection to fusion applications. We present several types of architecture for managing multi-sensor data fusion, specifically as they relate to the tracking-correlation function and blackboard processing representations in knowledge engineering. Object-Process Methods are used to model the information fusion process and supporting systems. Several mathematical techniques are shown to be useful in the fusion of numerical properties, sensor data updating and the implementation of unique detection probabilities. Finally, we discuss the importance of fusion to the concept and operation of the Semantic Web, which promises new ways to exploit the synergy of multi-sensor data platforms. This requires the synthesis of fusion with ontology models for knowledge representation. We discuss the importance of fusion as a reuse process in ontological engineering, and review key lifecycle models in ontology development. The evolutionary approach to ontology development is considered the most useful and adaptable to the complexities of semantic networks. Several potential applications for data fusion are screened and ranked according to the Joint Directors of Laboratories (JDL) process model for information fusion. Based on these predetermined criteria, the case of medical diagnostic imaging was found to offer the most promising applications for fusion, on which future product platforms can be built.
4

Hu, Xi. "Network and sensor management for multiple sensor emitter location system." Diss., Online access via UMI:, 2008.

Abstract:
Thesis (Ph. D.)--State University of New York at Binghamton, Thomas J. Watson School of Engineering and Applied Science, Department of Electrical and Computer Engineering, 2008.
Includes bibliographical references.
5

Odabasi, Mehmet. "User Acceptance of North Central Texas Fusion Center System by Law Enforcement Officers." Thesis, University of North Texas, 2010. https://digital.library.unt.edu/ark:/67531/metadc33191/.

Abstract:
The September 11 terrorist attacks exposed the lack of information sharing between law enforcement agencies as a potential threat to sound law enforcement in the United States. Therefore, many law enforcement agencies, as well as the federal government, have been initiating information sharing systems among law enforcement agencies to eradicate the information sharing problem. One of the systems established by Homeland Security is the North Central Texas Fusion Center (NCTFC). This study evaluates the NCTFC by utilizing user acceptance methodology. The unified theory of acceptance and use of technology serves as the theoretical framework for this study. Within the study, the user acceptance literature is examined and various models and theories are discussed. Furthermore, brief information regarding the intelligence work done by law enforcement agencies is provided. In addition to the NCTFC, several major law enforcement information systems are introduced. The data for this study come from users of the NCTFC across the north central Texas region. Surveys and interviews are used to triangulate the data. The study finds that performance expectancy and effort expectancy are important indicators of system use, and that outreach and needs assessment are important factors in establishing such systems. The results of the study offer valuable input for NCTFC administrators, law enforcement officials, and future researchers.
6

Cook, Brandon M. "An Intelligent System for Small Unmanned Aerial Vehicle Traffic Management." University of Cincinnati / OhioLINK, 2021. http://rave.ohiolink.edu/etdc/view?acc_num=ucin1617106257481515.

7

Johansson, Ronnie. "Large-Scale Information Acquisition for Data and Information Fusion." Doctoral thesis, Stockholm, 2006. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-3890.

8

Uchytil, Joseph. "Assessing the operational value of situational awareness for AEGIS and Ship Self Defense System (SSDS) platforms through the application of the Knowledge Value Added (KVA) methodology." Thesis, Monterey, Calif. : Springfield, Va. : Naval Postgraduate School ; Available from National Technical Information Service, 2006. http://library.nps.navy.mil/uhtbin/hyperion/06Jun%5FUchytil.pdf.

Abstract:
Thesis (M.S. in Information Technology Management)--Naval Postgraduate School, June 2006.
Thesis Advisor(s): Thomas J. Housel. "June 2006." Includes bibliographical references (p. 71-72). Also available in print.
9

Hill, Elisa Marie. "The use of a novel treatment protocol based off acceptance and commitment therapy for the problematic behaviors of two high functioning children with autism." OpenSIUC, 2013. https://opensiuc.lib.siu.edu/theses/1342.

Abstract:
The present study extends previous research on acceptance and commitment therapy (ACT) by using a new ACT protocol, the Fusion Management System (Dixon, 2013, in press), with a new population. Participants were two high-functioning children with autism: an 8-year-old boy and a 12-year-old girl. Using a multiple baseline design, the two participants were exposed to 15 hours of therapy over the span of 12 weeks. Behavioral data were collected for each participant by their parent(s) on a problematic behavior of concern to them. Prior to and following the ACT intervention, both participants completed three psychometric measures designed to assess ACT-related processes: the Child and Adolescent Mindfulness Measure (CAMM), the Acceptance and Action Questionnaire-II (AAQ-II), and the Avoidance and Fusion Questionnaire for Youth (AFQ-Y). Following the intervention, one participant improved on all measures of the ACT-related processes, while the other participant's scores improved on the AFQ-Y and deteriorated slightly on the CAMM and AAQ-II. During the intervention phase of this study, both participants' problematic behavior improved significantly. Implications of the study and future research are also discussed.
10

Serce, Fatma Cemile. "A Multi-agent Adaptive Learning System For Distance Education." PhD thesis, METU, 2008. http://etd.lib.metu.edu.tr/upload/12609220/index.pdf.

Abstract:
Adaptiveness provides unique identification and monitoring of the learner's learning activities according to his/her respective profile. Adaptive intelligent learning management systems (AILMS) help a wide range of students to achieve their learning goals effectively by delivering knowledge in an adaptive or individualized style through online learning settings. This study presents a multi-agent system, called MODA, developed to provide adaptiveness in learning management systems (LMS). A conceptual framework for adaptive learning systems is proposed for this purpose. The framework is based on the idea that adaptiveness is the best matching between the learner profile and the course content profile. The learning styles of learners and the content type of learning material are used to match the learner to the most suitable content. The thesis covers the pedagogical framework applied in MODA, the technical and multi-agent architectures of MODA, the TCP/IP-based protocol providing communication between MODA and the LMS, and a sample application of the system to an open-source learning management system, OLAT. The study also discusses possible future directions.
11

Zayrit, Karima. "Fusion de données imparfaites multi-sources : application à la spatialisation qualifiée des pratiques agricoles." Thesis, Reims, 2015. http://www.theses.fr/2015REIMS041/document.

Abstract:
This thesis is part of a regional project aimed at developing an observatory of agricultural practices in the watershed of the Vesle. The objective of this agri-environmental information system is to understand the practices responsible for the pollution of the water resource by pesticides of agricultural origin in the study area, and to provide relevant and sustainable tools to estimate their impacts. Our problem concerns the handling of imperfection in the process of fusing multiple, imperfect data sources. Information on practices is not exhaustive and is not subject to declaration, so this knowledge must be built through the joint use of multiple sources of varying quality, integrating the management of imperfect information into the information system. In this context, we propose methods for the spatial reconstruction of information on agricultural practices from remote sensing, the RPG, field surveys and expert opinion, a reconstruction qualified by an assessment of the quality of the information. We also propose a conceptual model of the imperfect agronomic entities of the information system, building on UML and PERCEPTORY. We provide models for representing the imperfect information from the various sources using either fuzzy sets or the theory of belief functions, and integrate these models into the computation of agri-environmental indicators such as the TFI and the ASQ.
12

Utete, Simukai. "Network management in decentralised sensing systems." Thesis, University of Oxford, 1994. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.297308.

13

Johansson, Ronnie. "Information Acquisition in Data Fusion Systems." Licentiate thesis, KTH, Numerical Analysis and Computer Science, NADA, 2003. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-1673.

Abstract:

By purposefully utilising sensors, for instance by a data fusion system, the state of some system-relevant environment might be adequately assessed to support decision-making. The ever-increasing access to sensors offers great opportunities, but also incurs grave challenges. As a result of managing multiple sensors one can, e.g., expect to achieve a more comprehensive, resolved, certain and more frequently updated assessment of the environment than would be possible otherwise. Challenges include data association, treatment of conflicting information and strategies for sensor coordination.

We use the term information acquisition to denote the skill of a data fusion system to actively acquire information. The aim of this thesis is to instructively situate that skill in a general context, explore and classify related research, and highlight key issues and possible future work. It is our hope that this thesis will facilitate communication, understanding and future efforts for information acquisition.

The previously mentioned trend towards utilisation of large sets of sensors makes us especially interested in large-scale information acquisition, i.e., acquisition using many and possibly spatially distributed and heterogeneous sensors.

Information acquisition is a general concept that emerges in many different fields of research. In this thesis, we survey literature from, e.g., agent theory, robotics and sensor management. We, furthermore, suggest a taxonomy of the literature that highlights relevant aspects of information acquisition.

We describe a function, perception management (akin to sensor management), which realizes information acquisition in the data fusion process, and pertinent properties of its external stimuli, sensing resources, and system environment.

An example of perception management is also presented. The task is that of managing a set of mobile sensors that jointly track some mobile targets. The game-theoretic algorithm suggested for distributing the targets among the sensors proves to be more robust to sensor failure than a measurement-accuracy-optimal reference algorithm.

Keywords: information acquisition, sensor management, resource management, information fusion, data fusion, perception management, game theory, target tracking

14

Roquel, Arnaud. "Exploitation du conflit entre capteurs pour la gestion d'un système complexe multi-capteurs." Phd thesis, Université Paris Sud - Paris XI, 2012. http://tel.archives-ouvertes.fr/tel-00804661.

Abstract:
Today's complex systems integrate many physical or logical sensors that support optimal decision-making with respect to the exosystem and the endosystem. These sensors are data sources that deliver partial, imprecise and/or uncertain information, partially complementary and partially redundant. The theory of belief functions offers a formal framework well suited to representing the imprecision and uncertainty of information. However, even when the ignorance and imprecision of the sources are modelled, the absence of conflict between sources is not guaranteed. In the theory of belief functions, disagreement between sources is classically measured in terms of "Dempsterian" conflict, the conflict resulting from the conjunctive combination of the sources, or in terms of dissimilarities or distances between belief functions. All these measures are global and give no direct information about the origin of the conflict. The main contribution of this thesis is to decompose the Dempsterian conflict in order to analyse it. We propose a decomposition over the simple and compound hypotheses of the frame of discernment. We show the uniqueness of this decomposition and detail the algorithm that computes it from the canonical decomposition of the belief function. Each term of the decomposition is then interpreted as the contribution to the global conflict made by the corresponding simple or compound hypothesis. This decomposition applies to the analysis of intra-source conflict (the conflict inherent in a source) as well as inter-source conflict (the conflict that appears when sources are fused). Toy examples illustrate how observing the distribution of the conflict over the hypotheses can identify the origin of certain conflicts.
Three applications of the measure are then developed to illustrate its usefulness. The first concerns preventive fall detection for a two-wheeled vehicle (motorcycle). The data sources are accelerations measured on the two wheels. A conflict between these measurements, assumed to be highly redundant or even correlated, is interpreted as the onset of a fall. We show that the conflict decomposition provides a finer and earlier fall indicator than the Dempsterian conflict measure. The second application concerns vehicle localisation, an essential problem for the autonomy of exploration vehicles and service robots. The sources are the outputs of algorithms estimating the vehicle's motion. We first show that dynamically estimating the reliability of the sources improves the fusion. We then show that the conflict decomposition gives a finer measure of the reliability of the fusion than the Dempsterian conflict. When a conflict is detected, the reliability of each source is estimated by checking a temporal-regularity assumption, itself based on a distance measure local to the simple and compound hypotheses. The third application generalises the Dubois-Prade hybrid combination rule to N sources: since our measure computes the partial conflict associated with each subset of hypotheses, we redistribute, in the spirit of the hybrid rule, the mass of each partial conflict to the disjunction of the hypotheses of that subset.
The conflict decomposition uniquely identifies the subsets of hypotheses contributing to the conflict. In conclusion, this work shows that the information carried by the conflict measure and its decomposition can, and should, be treated as information in its own right, in particular for managing the sources and the beliefs to be fused.
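The decomposition itself is beyond a short sketch, but the global "Dempsterian" conflict it decomposes is easy to state: it is the mass K that the conjunctive combination of two sources assigns to the empty set. A minimal illustration over a two-element frame (the mass functions are invented for the example, not taken from the thesis):

```python
from collections import defaultdict

def conjunctive_conflict(m1, m2):
    """Conjunctive combination of two mass functions whose focal elements
    are frozensets; returns the combined (unnormalised) masses and the
    global Dempsterian conflict K, i.e. the mass falling on the empty set."""
    combined = defaultdict(float)
    conflict = 0.0
    for a, va in m1.items():
        for b, vb in m2.items():
            inter = a & b
            if inter:
                combined[inter] += va * vb
            else:
                conflict += va * vb          # product lands on the empty set
    return dict(combined), conflict

# Source 1 mostly supports {a}; source 2 mostly supports {b}.
m1 = {frozenset({"a"}): 0.6, frozenset({"a", "b"}): 0.4}
m2 = {frozenset({"b"}): 0.7, frozenset({"a", "b"}): 0.3}
masses, K = conjunctive_conflict(m1, m2)     # K = 0.6 * 0.7
```

The single number K says the sources disagree but not about what; the thesis's contribution is to split K into per-hypothesis terms so the origin of the disagreement can be located.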
15

Jiao, Lianmeng. "Classification of uncertain data in the framework of belief functions : nearest-neighbor-based and rule-based approaches." Thesis, Compiègne, 2015. http://www.theses.fr/2015COMP2222/document.

Abstract:
In many classification problems, data are inherently uncertain. The available training data might be imprecise, incomplete, even unreliable. Besides, partial expert knowledge characterizing the classification problem may also be available. These different types of uncertainty bring great challenges to classifier design. The theory of belief functions provides a well-founded and elegant framework to represent and combine a large variety of uncertain information. In this thesis, we use this theory to address uncertain-data classification problems based on two popular approaches: the k-nearest-neighbor (kNN) rule and rule-based classification systems. For the kNN rule, one concern is that imprecise training data in class-overlapping regions may greatly affect its performance. An evidential editing version of the kNN rule was developed within the theory of belief functions in order to model the imprecise information carried by samples in overlapping regions. Another consideration is that sometimes only an incomplete training data set is available, in which case the performance of the kNN rule degrades dramatically. Motivated by this problem, we designed an evidential fusion scheme for combining a group of pairwise kNN classifiers built on locally learned pairwise distance metrics. For rule-based classification systems, in order to improve their performance in complex applications, we extended the traditional fuzzy rule-based classification system in the framework of belief functions and developed a belief rule-based classification system to address uncertain information in complex classification problems. Further, considering that in some applications partial expert knowledge can be available in addition to training data collected by sensors, a hybrid belief rule-based classification system was developed to make use of these two types of information jointly for classification.
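The evidential kNN rule these extensions build on (usually attributed to Denœux) has each neighbour support its own class with a mass that decays with its distance to the query, the remainder going to the whole frame, and pools the neighbours with Dempster's rule. A self-contained sketch; the parameters alpha and gamma and the toy data are illustrative assumptions:

```python
import math
from collections import defaultdict

def dempster(m1, m2):
    """Dempster's rule: conjunctive combination followed by normalisation."""
    out = defaultdict(float)
    conflict = 0.0
    for a, va in m1.items():
        for b, vb in m2.items():
            if a & b:
                out[a & b] += va * vb
            else:
                conflict += va * vb
    return {a: v / (1.0 - conflict) for a, v in out.items()}

def evidential_knn(train, query, k=3, alpha=0.95, gamma=1.0):
    """Classify `query` by pooling evidence from its k nearest neighbours.

    train -- list of (feature_tuple, label) pairs.
    A neighbour at distance d supports its own class with mass
    alpha * exp(-gamma * d**2); the rest stays on the whole frame.
    """
    theta = frozenset(y for _, y in train)            # frame of discernment
    neighbours = sorted((math.dist(x, query), y) for x, y in train)[:k]
    m = {theta: 1.0}                                  # vacuous belief
    for d, y in neighbours:
        s = alpha * math.exp(-gamma * d * d)
        m = dempster(m, {frozenset({y}): s, theta: 1.0 - s})
    return m

train = [((0.0, 0.0), "a"), ((0.0, 1.0), "a"),
         ((5.0, 5.0), "b"), ((5.0, 6.0), "b")]
m = evidential_knn(train, (0.0, 0.5))
```

Because distant neighbours contribute almost vacuous mass functions, an imprecise or remote sample barely moves the result, which is the behaviour the editing and fusion schemes above refine.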
16

McConky, Katie Theresa. "Design and analysis of information fusion, dynamic sensor management rules for cyber security systems using simulation /." Online version of thesis, 2007. http://hdl.handle.net/1850/4895.

17

Bowden, Todd H. "Design and Development of an Electronic Performance Enhancement Tool for Creating and Maintaining Information Management Web Sites." Diss., Virginia Tech, 2011. http://hdl.handle.net/10919/77315.

Abstract:
This study explored the design and development of an electronic performance enhancement tool that can assist a person with limited programming skills to create a variety of simple customized information management websites. In particular, this study was modeled after needs within an Instruction Technology department in which individuals were able to create pre-functional web pages with various elements such as textboxes and dropdown menus but lacked the programming skills necessary to add functionality to these web forms. Skilled programmers could add functionality to these pre-functioning web forms or create customized information management websites from scratch. However, programmers are not always available when needed. At the time of this study, there was no readily available way for persons to create customized information management websites without the services of a programmer or without needing to learn programming skills themselves. This study sought to determine what functionalities, characteristics and capabilities could be included in an electronic performance enhancement tool to assist non-programmers to create simple customized information management websites and how a tool with such functionalities, characteristics and capabilities could be designed and developed. A prototype version of such tool (named the Form And DataBase Interaction Tool or "FADBIT") was designed and developed in this study. This tool asks users who have created simple pre-functional web forms to answer a series of questions related to those webforms. Given the user's responses to these questions, this tool is able to form a metalanguage representation of the user's intentions for the web form and can translate this representation into useful programming code to add the desired functionality. 
The tool was successfully designed and developed using a generalized modular framework, and a Create-Adapt-Generalize model, with each module addressing one or more patterns common to web programming. The prototype tool successfully allowed non-programmers to create functional information websites for two structured evaluation projects, and achieved some level of success and encountered some difficulties with an unstructured project. Proposed modifications and extensions to the tool to address the difficulties encountered are presented.
18

Bjarnolf, Philip. "Threat Analysis Using Goal-Oriented Action Planning : Planning in the Light of Information Fusion." Thesis, University of Skövde, School of Humanities and Informatics, 2008. http://urn.kb.se/resolve?urn=urn:nbn:se:his:diva-1108.

Abstract:

An entity capable of assessing its own and others' action capabilities possesses the power to predict how the involved entities may change their world. Through this knowledge and a higher level of situation awareness, the assessing entity may choose the actions that have the most suitable effect, resulting in that entity's desired world state.

This thesis covers aspects and concepts of an arbitrary planning system and presents a threat analyzer architecture built on the novel planning system Goal-Oriented Action Planning (GOAP). This planning system has been suggested for improved missile route planning and targeting, and has been applied in contemporary computer games such as F.E.A.R. – First Encounter Assault Recon and S.T.A.L.K.E.R.: Shadow of Chernobyl. The GOAP architecture realized in this project is utilized by two agents that perform action planning to reach their desired world states. One of the agents employs a modified GOAP planner as a threat analyzer in order to determine what threat level the adversary agent constitutes. This project also introduces a conceptual schema of a general planning system that considers orders, doctrine and style, as well as a schema depicting an agent system using a blackboard in conjunction with the OODA loop.

19

Sutor, S. R. (Stephan R. ). "Large-scale high-performance video surveillance." Doctoral thesis, Oulun yliopisto, 2014. http://urn.fi/urn:isbn:9789526205618.

Abstract:
Abstract The last decade was marked by a set of harmful events ranging from economic crises to organized crime, acts of terror and natural catastrophes. This has led to a paradigm shift concerning security. Millions of surveillance cameras have been deployed, which led to new challenges, as the systems and operations behind those cameras could not cope with the rapid growth in the number of video cameras and systems. In today's control rooms, often hundreds or even thousands of cameras are displayed, overloading security officers with irrelevant information. The purpose of this research was the creation of a novel video surveillance system with automated analysis mechanisms that enable security authorities and their operators to cope with this information flood. By automating the process, video surveillance was transformed into a proactive information system. Progress in technology as well as the ever-increasing demand for security have proven to be enormous drivers for security technology research such as this study. This work shall contribute to the protection of our personal freedom, our lives, our property and our society by aiding the prevention of crime and terrorist attacks that diminish our personal freedom. In this study, design science research methodology was utilized in order to ensure scientific rigor while constructing and evaluating artifacts. The requirements for this research were gathered in close cooperation with high-level security authorities, and prior research was studied in detail. The created construct, the "Intelligent Video Surveillance System", is a distributed, highly scalable software framework that can function as a basis for any kind of high-performance video surveillance system, from installations focusing on high availability to flexible cloud-based installations that scale across multiple locations and tens of thousands of cameras.
First, in order to provide a strong foundation, a modular, distributed system architecture was created, which was then augmented by a multi-sensor analysis process. This enabled the analysis of data from multiple sources, combining video and other sensors in order to automatically detect critical events. Further, an intelligent mobile client, the video surveillance local control, was created to address remote access applications. Finally, a wireless self-contained surveillance system was introduced: a novel smart camera concept enabling ad hoc and mobile surveillance. The value of the created artifacts was proven by evaluation at two real-world sites: an international airport, a large-scale installation with high-security requirements, and a security service provider offering a multitude of video-based services from a video control center with thousands of cameras connected.
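A corroboration rule of the kind this abstract describes, declaring a critical event only when video analytics agree with an independent sensor, could be sketched as follows. The time-window rule, function name and parameters are illustrative assumptions, not taken from the thesis.

```python
def corroborated_events(video_alerts, sensor_alerts, window=5.0):
    """Flag a critical event only when a video alert is corroborated
    by an independent sensor reading within `window` seconds.
    Inputs are event timestamps in seconds (illustrative rule)."""
    events = []
    sensor_alerts = sorted(sensor_alerts)
    j = 0
    for t in sorted(video_alerts):
        # Skip sensor readings too old to corroborate this alert.
        while j < len(sensor_alerts) and sensor_alerts[j] < t - window:
            j += 1
        if j < len(sensor_alerts) and abs(sensor_alerts[j] - t) <= window:
            events.append(t)
    return events
```

For example, a video alert at t=10 s backed by a door sensor at t=12 s would be reported, while an uncorroborated alert at t=30 s would be suppressed, which is one way to reduce the operator overload described above.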
APA, Harvard, Vancouver, ISO, and other styles
20

Nasman, James M. "Deployed virtual consulting : the fusion of wearable computing, collaborative technology, augmented reality and intelligent agents to support fleet aviation maintenance /." Thesis, Monterey, Calif. : Springfield, Va. : Naval Postgraduate School ; Available from National Technical Information Service, 2004. http://library.nps.navy.mil/uhtbin/hyperion/04Mar%5FNasman.pdf.

Full text
Abstract:
Thesis (M.S. in Information Technology Management)--Naval Postgraduate School, March 2004.
Thesis advisor(s): Alex Bordetsky, Gurminder Singh. Includes bibliographical references (p. 49). Also available online.
APA, Harvard, Vancouver, ISO, and other styles
21

Plachkov, Alex. "Soft Data-Augmented Risk Assessment and Automated Course of Action Generation for Maritime Situational Awareness." Thesis, Université d'Ottawa / University of Ottawa, 2016. http://hdl.handle.net/10393/35336.

Full text
Abstract:
This thesis presents a framework capable of integrating hard (physics-based) and soft (people-generated) data for the purpose of achieving increased situational assessment (SA) and effective course of action (CoA) generation upon risk identification. The proposed methodology is realized through the extension of an existing Risk Management Framework (RMF). In this work, the RMF’s SA capabilities are augmented via the injection of soft data features into its risk modeling; the performance of these capabilities is evaluated via a newly-proposed risk-centric information fusion effectiveness metric. The framework’s CoA generation capabilities are also extended through the inclusion of people-generated data, capturing important subject matter expertise and providing mission-specific requirements. Furthermore, this work introduces a variety of CoA-related performance measures, used to assess the fitness of each individual potential CoA, as well as to quantify the overall chance of mission success improvement brought about by the inclusion of soft data. This conceptualization is validated via experimental analysis performed on a combination of real-world and synthetically-generated maritime scenarios. It is envisioned that the capabilities put forth herein will take part in a greater system, capable of ingesting and seamlessly integrating vast amounts of heterogeneous data, with the intent of providing accurate and timely situational updates, as well as assisting in operational decision making.
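As an illustration of the hard/soft blending idea described above, the sketch below fuses a physics-based risk score with credibility-weighted people-generated reports. The weighting scheme, function name and scale are hypothetical, not the RMF's actual risk model.

```python
def fuse_risk(hard_risk, soft_reports):
    """Blend a physics-based risk score in [0, 1] with a list of
    (risk, credibility) tuples from human sources, discounting each
    report by its source-credibility weight (illustrative model)."""
    numerator = hard_risk          # hard data carries unit weight
    denominator = 1.0
    for risk, credibility in soft_reports:
        numerator += credibility * risk
        denominator += credibility
    return numerator / denominator
```

With no soft reports the hard score passes through unchanged; a fully credible report pulls the fused score halfway toward its own assessment, which is one simple way to let subject matter expertise shift a sensor-derived risk estimate.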
APA, Harvard, Vancouver, ISO, and other styles
22

Vestin, Albin, and Gustav Strandberg. "Evaluation of Target Tracking Using Multiple Sensors and Non-Causal Algorithms." Thesis, Linköpings universitet, Reglerteknik, 2019. http://urn.kb.se/resolve?urn=urn:nbn:se:liu:diva-160020.

Full text
Abstract:
Today, the main research field for the automotive industry is to find solutions for active safety. In order to perceive the surrounding environment, tracking nearby traffic objects plays an important role. Validation of the tracking performance is often done in staged traffic scenarios, where additional sensors, mounted on the vehicles, are used to obtain their true positions and velocities. The difficulty of evaluating the tracking performance complicates its development. An alternative approach studied in this thesis is to record sequences and use non-causal algorithms, such as smoothing, instead of filtering to estimate the true target states. With this method, validation data for online, causal, target tracking algorithms can be obtained for all traffic scenarios without the need for extra sensors. We investigate how non-causal algorithms affect the target tracking performance using multiple sensors and dynamic models of different complexity. This is done to evaluate real-time methods against estimates obtained from non-causal filtering. Two different measurement units, a monocular camera and a LIDAR sensor, and two dynamic models are evaluated and compared using both causal and non-causal methods. The system is tested in two single-object scenarios where ground truth is available and in three multi-object scenarios without ground truth. Results from the two single-object scenarios show that tracking using only a monocular camera performs poorly, since it is unable to measure the distance to objects. Here, a complementary LIDAR sensor improves the tracking performance significantly. The dynamic models are shown to have a small impact on the tracking performance, while the non-causal application gives a distinct improvement when tracking objects at large distances. Since the sequence can be reversed, the non-causal estimates are propagated from more certain states when the target is closer to the ego vehicle.
For multiple object tracking, we find that correct associations between measurements and tracks are crucial for improving the tracking performance with non-causal algorithms.
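The smoothing idea above, estimating each state from both past and future measurements, can be illustrated with a scalar Kalman filter followed by a Rauch-Tung-Striebel backward pass. This is a minimal sketch assuming a random-walk motion model, not the thesis's multi-sensor implementation; all variable names and noise values are illustrative.

```python
def kalman_filter(zs, q=0.01, r=1.0, x0=0.0, p0=1.0):
    """Causal filter for a scalar random-walk state x_k = x_{k-1} + w,
    with process variance q and measurement variance r. Returns the
    filtered estimates/variances and the one-step predictions."""
    xs, ps, xps, pps = [], [], [], []
    x, p = x0, p0
    for z in zs:
        xp, pp = x, p + q            # predict
        k = pp / (pp + r)            # Kalman gain
        x = xp + k * (z - xp)        # update with the new measurement
        p = (1 - k) * pp
        xps.append(xp); pps.append(pp)
        xs.append(x); ps.append(p)
    return xs, ps, xps, pps

def rts_smoother(xs, ps, xps, pps):
    """Non-causal backward pass: each estimate also uses later data,
    so its variance can only shrink relative to the causal filter."""
    n = len(xs)
    xs_s, ps_s = xs[:], ps[:]
    for k in range(n - 2, -1, -1):
        c = ps[k] / pps[k + 1]       # smoother gain (F = 1 here)
        xs_s[k] = xs[k] + c * (xs_s[k + 1] - xps[k + 1])
        ps_s[k] = ps[k] + c * c * (ps_s[k + 1] - pps[k + 1])
    return xs_s, ps_s
```

Running both passes over a recorded measurement sequence shows the effect the abstract exploits: the smoothed variance at early time steps is strictly below the filtered one, because the backward pass propagates certainty from later, better-observed states.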
APA, Harvard, Vancouver, ISO, and other styles
23

"Modeling Supply Chain Dynamics with Calibrated Simulation Using Data Fusion." Doctoral diss., 2010. http://hdl.handle.net/2286/R.I.8609.

Full text
Abstract:
In today's global market, companies are facing unprecedented levels of uncertainty in supply, demand and the economic environment. A critical issue for companies seeking to survive increasing competition is to monitor the changing business environment and manage disturbances and changes in real time. In this dissertation, an integrated framework is proposed using simulation and online calibration methods to enable the adaptive management of large-scale complex supply chain systems. The design, implementation and verification of the integrated approach are studied in this dissertation. The research contributions are two-fold. First, this work enriches symbiotic simulation methodology by proposing a framework of simulation and advanced data fusion methods to improve simulation accuracy. Data fusion techniques optimally calibrate the simulation state/parameters by considering errors in both the simulation models and in measurements of the real-world system. Data fusion methods (Kalman Filtering, Extended Kalman Filtering, and Ensemble Kalman Filtering) are examined and discussed under varied conditions of system chaotic levels, data quality and data availability. Second, the proposed framework is developed, validated and demonstrated in 'proof-of-concept' case studies on representative supply chain problems. In the case study of a simplified supply chain system, Kalman Filtering is applied to fuse simulation data and emulation data to effectively improve the accuracy of the detection of abnormalities. In the case study of the 'beer game' supply chain model, the system's chaotic level is identified as a key factor influencing simulation performance and the choice of data fusion method. Ensemble Kalman Filtering is found to be more robust than Extended Kalman Filtering in a highly chaotic system. With appropriate tuning, the improvement in simulation accuracy is up to 80% in a chaotic system, and 60% in a stable system.
In the last study, the integrated framework is applied to adaptive inventory control of a multi-echelon supply chain with non-stationary demand. It is worth pointing out that the framework proposed in this dissertation is not only useful in supply chain management, but also suitable to model other complex dynamic systems, such as healthcare delivery systems and energy consumption networks.
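A minimal sketch of the ensemble Kalman filtering step used for this kind of online simulation calibration might look as follows, assuming a scalar state and an illustrative persistence model; none of the numbers or names correspond to the dissertation's experiments.

```python
import random

def enkf_step(ensemble, z, model, q=0.5, r=1.0):
    """One EnKF cycle: forecast every ensemble member through the
    simulation model (with process noise std q), then assimilate a
    perturbed copy of the real-world measurement z (variance r)."""
    # Forecast step: run the simulation for each ensemble member.
    fc = [model(x) + random.gauss(0.0, q) for x in ensemble]
    mean = sum(fc) / len(fc)
    var = sum((v - mean) ** 2 for v in fc) / (len(fc) - 1)
    gain = var / (var + r)           # ensemble-estimated Kalman gain
    # Analysis step: nudge each member toward a perturbed observation.
    return [v + gain * (z + random.gauss(0.0, r ** 0.5) - v) for v in fc]
```

Repeatedly assimilating measurements pulls the ensemble toward the observed state of the real system, which is the calibration effect the framework relies on; unlike the EKF, no linearized model Jacobian is needed, which is one reason EnKF can be more robust on highly chaotic systems.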
Dissertation/Thesis
Ph.D. Engineering 2010
APA, Harvard, Vancouver, ISO, and other styles
24

Jesneck, JL, LW Nolte, JA Baker, CE Floyd, and JY Lo. "Optimized approach to decision fusion of heterogeneous data for breast cancer diagnosis." Thesis, 2006. http://hdl.handle.net/10161/207.

Full text
Abstract:
As more diagnostic testing options become available to physicians, it becomes more difficult to combine the various types of medical information in order to optimize the overall diagnosis. To improve diagnostic performance, here we introduce an approach to optimize a decision-fusion technique to combine heterogeneous information, such as from different modalities, feature categories, or institutions. For classifier comparison we used two performance metrics: the area under the receiver operating characteristic (ROC) curve (AUC) and the normalized partial area under the curve (pAUC). This study used four classifiers: linear discriminant analysis (LDA), an artificial neural network (ANN), and two variants of our decision-fusion technique, AUC-optimized (DF-A) and pAUC-optimized (DF-P) decision fusion. We applied each of these classifiers with 100-fold cross-validation to two heterogeneous breast cancer data sets: one of mass lesion features and a much more challenging one of microcalcification lesion features. For the calcification data set, DF-A outperformed the other classifiers in terms of AUC (p < 0.02) and achieved AUC = 0.85 +/- 0.01. DF-P surpassed the other classifiers in terms of pAUC (p < 0.01) and reached pAUC = 0.38 +/- 0.02. For the mass data set, DF-A outperformed both the ANN and the LDA (p < 0.04) and achieved AUC = 0.94 +/- 0.01. Although for this data set there were no statistically significant differences among the classifiers' pAUC values (pAUC = 0.57 +/- 0.07 to 0.67 +/- 0.05, p > 0.10), DF-P did significantly improve specificity versus the LDA at both 98% and 100% sensitivity (p < 0.04). In conclusion, decision fusion directly optimized clinically significant performance measures, such as AUC and pAUC, and sometimes outperformed two well-known machine-learning techniques when applied to two different breast cancer data sets.
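The two performance metrics compared above can be computed from raw classifier scores with a simple trapezoidal rule. This sketch assumes untied scores and binary labels and is not the authors' code; with fpr_max = 1.0 it yields the full AUC, and with a smaller fpr_max the normalized partial AUC over the high-specificity region.

```python
def roc_points(scores, labels):
    """ROC operating points (FPR, TPR) obtained by sweeping the
    decision threshold from high to low (assumes untied scores)."""
    pairs = sorted(zip(scores, labels), reverse=True)
    pos = sum(labels)
    neg = len(labels) - pos
    tp = fp = 0
    pts = [(0.0, 0.0)]
    for _, y in pairs:
        if y:
            tp += 1
        else:
            fp += 1
        pts.append((fp / neg, tp / pos))
    return pts

def partial_auc(pts, fpr_max=1.0):
    """Trapezoidal area under the ROC curve up to fpr_max, divided by
    fpr_max so the result lies in [0, 1]; fpr_max=1.0 is the full AUC."""
    area = 0.0
    for (x0, y0), (x1, y1) in zip(pts, pts[1:]):
        if x0 >= fpr_max:
            break
        if x1 > fpr_max:             # clip the last segment at fpr_max
            y1 = y0 + (y1 - y0) * (fpr_max - x0) / (x1 - x0)
            x1 = fpr_max
        area += (x1 - x0) * (y0 + y1) / 2.0
    return area / fpr_max
```

Optimizing a fusion rule directly on `partial_auc` with a small fpr_max, rather than on overall accuracy, is the kind of clinically targeted objective (high specificity at high sensitivity) that the abstract's DF-P variant pursues.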
Dissertation
APA, Harvard, Vancouver, ISO, and other styles
25

Banisakher, Mubarak. "A Human-Centric Approach to Data Fusion in Post-Disaster Management: The Development of a Fuzzy Set Theory Based Model." Doctoral diss., 2014. http://digital.library.ucf.edu/cdm/ref/collection/ETD/id/6055.

Full text
Abstract:
It is critical to provide an efficient and accurate information system in the post-disaster phase so that individuals can access and obtain the necessary resources in a timely manner; however, current map-based post-disaster management systems present all emergency resource lists without filtering them, which usually leads to high computational cost. An effective post-disaster management system (PDMS) distributes emergency resources such as hospitals, storage and transportation much more reasonably and is more beneficial to individuals in the post-disaster period. In this dissertation, a semi-supervised learning (SSL) based graph system was first constructed for the PDMS. The graph-based PDMS resource map was converted to a directed graph represented by an adjacency matrix, and decision information was then derived from the PDMS in two ways: a clustering operation, and a graph-based semi-supervised optimization process. The PDMS was applied to emergency resource distribution in the post-disaster (response) phase, where a path optimization algorithm based on ant colony optimization (ACO) was used to minimize cost; simulation results show the effectiveness of the proposed methodology. The analysis compared it with clustering-based algorithms under improved ACO variants, the tour improvement algorithm (TIA) and the Min-Max Ant System (MMAS), and the results also show that the SSL-based graph is more effective for computing the optimal path in the PDMS. This research further improved the map by combining the disaster map with the initial GIS-based map, which locates the target area while considering the influence of the disaster. First, both the initial map and the disaster map undergo a Gaussian transformation while the histograms of all map images are acquired.
All images are then decomposed with the discrete wavelet transform (DWT), and a Gaussian fusion algorithm is applied to the DWT coefficients. Second, the inverse DWT (iDWT) is applied to generate a new map for the post-disaster management system. Finally, simulations were carried out, and the results showed the effectiveness of the proposed method compared to other fusion algorithms, such as mean-mean fusion and max-UD fusion, through evaluation indices including entropy, spatial frequency (SF) and the image quality index (IQI). A fuzzy set model was proposed to improve the representation capacity of nodes in this GIS-based PDMS.
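A toy version of the wavelet-fusion pipeline described above (decompose, fuse per band, invert) can be sketched with a one-level Haar transform on 1-D signals. The fusion rules here, averaging the approximation band and keeping the larger-magnitude detail coefficient, are common illustrative choices, not the dissertation's Gaussian fusion algorithm.

```python
def haar_dwt(x):
    """One-level Haar DWT of an even-length signal:
    returns (approximation, detail) coefficient lists."""
    a = [(x[i] + x[i + 1]) / 2 for i in range(0, len(x), 2)]
    d = [(x[i] - x[i + 1]) / 2 for i in range(0, len(x), 2)]
    return a, d

def haar_idwt(a, d):
    """Inverse one-level Haar DWT: exact reconstruction."""
    out = []
    for ai, di in zip(a, d):
        out += [ai + di, ai - di]
    return out

def fuse(x, y):
    """Fuse two equal-length signals in the wavelet domain: average
    the coarse band, keep the stronger detail coefficient per slot."""
    ax, dx = haar_dwt(x)
    ay, dy = haar_dwt(y)
    a = [(p + q) / 2 for p, q in zip(ax, ay)]
    d = [p if abs(p) >= abs(q) else q for p, q in zip(dx, dy)]
    return haar_idwt(a, d)
```

Fusing a signal with itself reproduces it exactly (the Haar pair is a perfect-reconstruction transform), while fusing two different maps blends their coarse content and preserves the sharper local structure, which is the intuition behind DWT-based map fusion.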
Ph.D.
Doctorate
Industrial Engineering and Management Systems
Engineering and Computer Science
Industrial Engineering
APA, Harvard, Vancouver, ISO, and other styles
26

Mendes, Gonçalo Ramos. "Expansion strategy through mergers and acquisitions: an analysis to the case of SAPA building system." Master's thesis, 2016. http://hdl.handle.net/10071/13501.

Full text
Abstract:
JEL Classification: G30, G34
Mergers and acquisitions (M&A) are a growth vehicle increasingly utilized by companies, whether to expand their activity to new markets, obtain synergies, increase market power, reduce costs or even to survive. The Portuguese context, characterized by a vast majority of small and medium enterprises (SMEs), presents some obstacles to this kind of integration, but also quite favorable conditions for it, as proven by the growing number of operations observed. Therefore, it is important to analyze not only the context in which they occur, but also the motives that lead managers and shareholders to opt more and more for this growth strategy, the advantages they seek to obtain, the several types of integration, the concept of synergy, the waves of M&A, the importance of due diligence, and how the process should be managed. Additionally, the most active sectors, both in number and volume of operations, will be addressed, as well as the main trends for the coming years and how macroeconomic factors and market conditions affect the occurrence of these operations. Afterwards, an analysis of the specific case of the merger of Sapa Building System, part of a joint venture created in 2013 by two of the biggest Norwegian conglomerates, Orkla and Hydro, will be conducted, focusing mainly on the effects of this operation in Portugal. The objectives established, synergies desired, results achieved so far, the challenges faced and the evolution of the aluminium industry are some of the topics addressed.
APA, Harvard, Vancouver, ISO, and other styles