Dissertations / Theses on the topic 'IoT Data Management'

Consult the top 50 dissertations / theses for your research on the topic 'IoT Data Management.'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever it is available in the metadata.

Browse dissertations / theses in a wide variety of disciplines and organise your bibliography correctly.

1

Laamech, Nouha. "Towards a secure data sharing management approach for IoT environments." Electronic Thesis or Diss., Pau, 2024. http://www.theses.fr/2024PAUU3031.

Full text
Abstract:
The Internet of Things (IoT) generates, connects and shares data collected from smart devices with various independent parties. With the increasing number of connected devices, its wide deployment is revolutionizing the modern world by covering almost every aspect of an individual's life. In this context, it is in the best interest of the community to successfully motivate users to share their IoT data with the rest of the environment, to allow the emergence of new services in fields such as healthcare, education, or industrial manufacturing. However, requesting data in order to extract valuable information from it can be a sensitive matter. Therefore, framing requests and providing clarity on how this information will be used is necessary for building trust and credibility in connected environments. More precisely, when data providers decide to share their data with the community, they have little control over how their information is being used and in which context. In parallel, data consumers cannot trace the different nodes that the available data passed through, or its processing history, to determine, for example, whether it meets the technical and legal requirements of a given activity. Our research focuses on three main challenges: (i) the definition of a semantic layer that handles the security requirements in the context of IoT data sharing, (ii) the enforcement of a context-aware security policy that matches both the data provider's preferences and the data consumer's usage, and (iii) the establishment of an end-to-end security solution that manages the sharing of IoT data in a decentralized architecture while eliminating the need to trust any involved IoT parties. To address these issues, we first present a context-aware IoT data Sharing Management ontology called IdSM-O, to establish a shared security vocabulary and handle the interoperability of IoT environments. Following that, we introduce a three-layer automatic semantic rule manager that collects data providers' security policy requirements and automatically translates them into semantic rules ready for reasoning. These contributions form the basis of IdSM, an end-to-end security framework for data sharing management during the phases of collection, transmission, and processing. Using this framework, we aim to address the enforcement of users' control over their smart devices, information security requirements, and obligation compliance between the various parties in the IoT environment. Finally, we design, implement, and develop a prototype of the proposal in order to prove its feasibility and analyze its performance.
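To make the idea of the rule manager concrete, here is a minimal Python sketch of how a provider's declarative sharing policy might be turned into a reasoner-ready, SWRL-like rule string. The function name, policy fields and rule vocabulary are illustrative assumptions, not the IdSM-O terms defined in the thesis.

```python
# Hypothetical illustration of translating a data provider's sharing policy
# into a SWRL-like rule string; field names and rule syntax are assumptions,
# not the IdSM-O vocabulary used in the thesis.

def make_rule(policy: dict) -> str:
    """Turn a declarative sharing policy into a reasoner-ready rule string."""
    body = [
        f"Data(?d) ^ hasOwner(?d, {policy['provider']})",
        f"Consumer(?c) ^ hasRole(?c, {policy['allowed_role']})",
        f"Request(?r) ^ hasPurpose(?r, {policy['purpose']})",
    ]
    if policy.get("context"):                      # optional contextual constraint
        body.append(f"hasContext(?r, {policy['context']})")
    head = "isAuthorized(?c, ?d)"
    return " ^ ".join(body) + " -> " + head

if __name__ == "__main__":
    policy = {"provider": "homeOwner1", "allowed_role": "EnergyProvider",
              "purpose": "BillingOptimization", "context": "OffPeakHours"}
    print(make_rule(policy))
```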
APA, Harvard, Vancouver, ISO, and other styles
2

Sellami, Youssef. "Secure data management in an IoT-Fog/Edge computing architecture." Electronic Thesis or Diss., Valenciennes, Université Polytechnique Hauts-de-France, 2024. https://ged.uphf.fr/nuxeo/site/esupversions/14bb8a1d-7fbb-4d10-a7e7-99650617c232.

Full text
Abstract:
The Internet of Things (IoT) aims to integrate the physical and digital worlds into a single ecosystem by interconnecting a large number of intelligent objects (sensors/actuators, smartphones, autonomous vehicles, etc.) to the internet. However, the massive amount of data is one of the inevitable consequences of the exponential growth in the number of connected objects. The evolution of the IoT and its applications in the years to come (Industry 4.0, smart cities, intelligent transport) requires data management adapted to the limited capacities of connected objects. New processing and communication paradigms, such as fog or edge computing, are being studied to meet the expectations of applications and their users. These architectures use components (such as routers, base stations, user machines, etc.) located in close proximity to the objects and the end users. However, this technological coupling of IoT and fog/edge computing does not yet incorporate security mechanisms that are sufficiently robust given the targeted deployment environments and the critical applications they will have to support. This thesis first explores the emerging IoT-edge and fog computing architectures and highlights the various security challenges and issues posed by this new paradigm. One of the critical problems identified is guaranteeing data integrity in the highly dynamic and distributed environment of fog computing. Unfortunately, traditional centralized third-party auditors are ineffective due to high network latency and associated constraints. Therefore, to solve this issue, we propose an efficient public verification protocol leveraging the Short Integer Solution (SIS) problem and identity-based signatures. This new protocol ensures data integrity and authenticity, allows for legitimate data modifications, and enables distributed data integrity verification without relying on a trusted third party. Furthermore, this thesis addresses data trustworthiness in fog computing systems, which is crucial for the reliability of events shared between fog nodes and data sources. A novel blockchain-based solution is presented to create a transparent, traceable environment for evaluating event trustworthiness, preserving trust scores, and fostering accountability. Our model calculates trust scores based on factors such as event plausibility, temporal relevance and distance relevance to effectively identify malicious entities and encourage trustworthy behavior. Finally, this thesis focuses on the protection of data confidentiality against the quantum threat in the edge/IoT context. Several post-quantum cryptographic schemes have been proposed in the literature, aiming to develop encryption techniques resistant to such attacks. Due to its promising security properties and efficiency against quantum attacks, NTRU was selected as a candidate in the final round of the NIST competition on post-quantum cryptography. However, this method poses challenges for constrained devices due to its potentially higher computational and memory requirements. Motivated by the necessity to increase the lifetime of resource-constrained IoT devices while being able to resist quantum attacks, we propose a new NTRU-based collaborative scheme. Our scheme preserves the confidentiality of sensitive information exchanged among constrained IoT devices deployed in an edge computing architecture. Moreover, it distributes the workload of the cryptographic operations across edge nodes and IoT devices within the same network. This collaborative approach allows IoT devices to significantly reduce their computational costs while guaranteeing data confidentiality. Furthermore, the proposed distribution of computation enables scalability of the architecture and improves the sustainability of IoT environments.
Keywords: Fog computing, Edge computing, IoT, Data integrity, Security, Lattice-based cryptography, SIS problem, NTRU, Trust management, Confidentiality
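As an illustration of the trust model described above, the following Python sketch combines event plausibility, temporal relevance and distance relevance into a single score. The weights, decay constants and decision threshold are invented for the example and are not the parameters used in the thesis.

```python
# Hypothetical sketch of a trust score combining event plausibility, temporal
# relevance and distance relevance; the weights, decay constants and threshold
# are illustrative assumptions, not the model defined in the thesis.
import math

def trust_score(plausibility: float, age_s: float, distance_m: float,
                w=(0.5, 0.3, 0.2), tau=300.0, d0=1000.0) -> float:
    """Weighted combination of three factors, each normalised to [0, 1]."""
    temporal = math.exp(-age_s / tau)         # newer reports count more
    spatial = math.exp(-distance_m / d0)      # closer reporters count more
    return w[0] * plausibility + w[1] * temporal + w[2] * spatial

if __name__ == "__main__":
    score = trust_score(plausibility=0.9, age_s=120, distance_m=250)
    print(f"trust = {score:.3f}", "-> trusted" if score > 0.6 else "-> suspicious")
```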
APA, Harvard, Vancouver, ISO, and other styles
3

Kandi, Mohamed Ali. "Lightweight key management solutions for heterogeneous IoT." Thesis, Compiègne, 2020. http://www.theses.fr/2020COMP2575.

Full text
Abstract:
The Internet of Things (IoT) is an emerging technology that has the potential to improve our daily lives in a number of ways. It consists of extending connectivity beyond standard devices (such as computers, tablets and smartphones) to all everyday objects. IoT devices, also called smart objects, can collect data from their surroundings, collaborate to process them and then act on their environment. This increases their functionalities and allows them to offer various services for the benefit of society. However, many challenges are slowing down the development of the IoT. 
Securing communication between its devices is one of the hardest issues that prevent this technology from revealing its full potential. Cryptography provides a set of mechanisms to secure data. For their proper functioning, these mechanisms require secret parameters called keys. Key Management is a branch of cryptography that encompasses all operations involving the handling of these keys: generation, storage, distribution and replacement. Lightweight cryptography, in turn, consists of extending the conventional mechanisms (including Key Management) to resource-limited devices. To be efficient in the IoT, the new mechanisms must offer a good compromise between security, performance and resource requirements. Lightweight Key Management is the essence of secure communication in the IoT and the core of our work. In this thesis, we propose a novel lightweight Key Management protocol to secure communication between heterogeneous and dynamic IoT devices. To design our solution, we consider three modes of communication: device-to-device, group and multi-group communication. While most related works focus only on one of these modes of communication, our solution efficiently secures all three of them. It also automatically balances the loads between heterogeneous devices according to their capabilities. We then prove that this makes our protocol more suitable for the IoT, as it is efficient and highly scalable. Furthermore, we propose a decentralization of our protocol based on blockchain technology and smart contracts. We show that, by empowering multiple participants to manage the cryptographic keys, decentralization solves trust issues, lowers the risk of system failure and improves security. We finally implement our solution on resource-constrained IoT motes that are based on the Contiki operating system. The objective is to experimentally evaluate the performance of our solution and to complete our theoretical analyses.
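The abstract's capability-based load balancing can be pictured with a small sketch: re-keying work is assigned to devices in proportion to a capability score. The proportional rule and the capability values below are assumptions, not the protocol's actual algorithm.

```python
# Illustrative sketch only: distributing re-keying workload across heterogeneous
# devices in proportion to their capabilities, in the spirit of the protocol's
# automatic load balancing. The capability metric and the proportional rule are
# assumptions, not the thesis's actual algorithm.

def balance_rekey_load(capabilities: dict[str, float], total_ops: int) -> dict[str, int]:
    """Assign each device a share of `total_ops` key-update operations
    proportional to its (CPU/energy) capability score."""
    total_cap = sum(capabilities.values())
    shares = {dev: int(round(total_ops * cap / total_cap))
              for dev, cap in capabilities.items()}
    # fix rounding drift so the shares add up exactly to total_ops
    drift = total_ops - sum(shares.values())
    if drift:
        strongest = max(capabilities, key=capabilities.get)
        shares[strongest] += drift
    return shares

if __name__ == "__main__":
    devices = {"gateway": 8.0, "sensor_a": 1.0, "sensor_b": 1.0, "actuator": 2.0}
    print(balance_rekey_load(devices, total_ops=120))
```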
APA, Harvard, Vancouver, ISO, and other styles
4

Sridharan, Vaikunth. "Sensor Data Streams Correlation Platform for Asthma Management." Wright State University / OhioLINK, 2018. http://rave.ohiolink.edu/etdc/view?acc_num=wright1527546937956439.

Full text
APA, Harvard, Vancouver, ISO, and other styles
5

Klasson, Anders, and Johan Rosengren. "Industrial IoT Management System for Tubes with Integrated Sensors." Thesis, KTH, Skolan för elektroteknik och datavetenskap (EECS), 2018. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-237412.

Full text
Abstract:
Sandvik has developed a technique to place sensors inside tubes. This technology has great market potential and can optimize many industrial processes. The finished product should be able to stream sensor data to cloud services for analysis and reading. The current system requires manual configuration on-site and the installation is labor intensive. This thesis investigates how the system's hardware can be configured automatically, and how a supporting IT system could function. A solution is presented in which a large portion of the installation process has been automated, along with an outline for a supporting system. The solution is evaluated by measuring its configuration complexity. The evaluation shows that the developed system had increased functionality compared to today's manual approach, while configuration complexity was not increased; in many aspects it was reduced.
APA, Harvard, Vancouver, ISO, and other styles
6

Minardi, Sara. "Processamento ed analisi di open data IoT mediante algoritmi di classificazione." Bachelor's thesis, Alma Mater Studiorum - Università di Bologna, 2019.

Find full text
Abstract:
The continuous development of technology in recent years has allowed an ever wider set of devices to connect and exchange information with one another and with the external environment. These connections are built over the network, giving shape to a new technological paradigm called the Internet of Things. The physical objects that interact in the system are called smart objects, because they are able to interact with other devices and with the outside world. On the network there are spaces that collect the large amounts of data being generated and, depending on how they can be accessed, these data can be identified as Open Data. To be open, data must be easily accessible, available free of charge, and set up to be potentially usable by everyone. Thanks to processing and analysis techniques, it is possible to transform raw data into valuable data. The project of this thesis consists in the implementation and evaluation of classification algorithms with different structures, based on Data Mining techniques, in order to find a classification model for heterogeneous data that can therefore be applied to an Open Data dataset.
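For readers unfamiliar with this kind of experiment, a minimal scikit-learn sketch of classifying heterogeneous IoT observations might look as follows; the toy features, labels and model choice are invented for illustration and are not the thesis's dataset or algorithms.

```python
# Minimal sketch of the kind of classification experiment the thesis describes:
# training a classifier to label heterogeneous open IoT observations. The toy
# features and labels below are invented for illustration only.
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

# toy observations: [value, sampling_rate_hz, value_range]; label = sensor type
X = [[21.5, 0.1, 60], [22.0, 0.1, 60], [1013.0, 0.05, 200],
     [1009.5, 0.05, 200], [55.0, 0.2, 100], [60.2, 0.2, 100]] * 20
y = (["temperature", "temperature", "pressure",
      "pressure", "humidity", "humidity"] * 20)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
clf = RandomForestClassifier(n_estimators=50, random_state=0).fit(X_tr, y_tr)
print("accuracy:", accuracy_score(y_te, clf.predict(X_te)))
```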
APA, Harvard, Vancouver, ISO, and other styles
7

Mezghani, Emna. "Towards Autonomic and Cognitive IoT Systems, Application to Patients’ Treatments Management." Thesis, Toulouse, INSA, 2016. http://www.theses.fr/2016ISAT0016/document.

Full text
Abstract:
In this thesis, we propose a collaborative model-driven methodology for designing autonomic cognitive IoT systems in order to deal with IoT design complexity. Within this methodology we defined a set of autonomic cognitive design patterns that aim at (1) delineating the dynamic coordination of the autonomic processes to deal with the system's context changeability and requirements evolution at run-time, and (2) adding cognitive abilities to IoT systems to understand big data and generate new insights. To address challenges related to big data and scalability, we propose a generic semantic big data platform that aims at integrating heterogeneous distributed data sources deployed on the cloud and generating knowledge that is exposed as a service (Knowledge as a Service, KaaS). As an application of the proposed contributions, we instantiated and combined a set of patterns for the development of a prescriptive cognitive system for patient treatment management. We elaborated two ontological models describing the wearable devices and the patient context, as well as the medical knowledge for decision-making. The proposed system is evaluated from the clinical perspective through collaboration with medical experts, and from the performance perspective through deploying the system within the KaaS under different configurations.
APA, Harvard, Vancouver, ISO, and other styles
8

Pisanò, Lorenzo. "IoT e Smart Irrigation: gestione dei Big Data attraverso un sistema di notifica intelligente." Bachelor's thesis, Alma Mater Studiorum - Università di Bologna, 2021. http://amslaurea.unibo.it/23531/.

Full text
Abstract:
With this thesis work I was also able to explore another highly topical theme connected to IoT: 'Smart Irrigation', also known as precision irrigation. Considering the increasingly evident need to improve the management of irrigation and energy distribution in agriculture, and bearing in mind meteo-climatic indications and the importance of having timely, up-to-date information to improve activities in the field, Smart Irrigation plays a relevant role in saving water and energy, avoiding waste and improper use of these precious resources. The software I developed is part of a wider European programme, the SWAMP project (Smart WAter Management Platform), whose goal is to bring about a decisive turning point in the moderate, waste-free use of fresh water for irrigation, proposing an efficient system for managing the distribution of this resource in various contexts. The project area is part of the territory administered by the Consorzio di Bonifica dell'Emilia Centrale (CBEC), responsible for irrigation and water drainage over an area of 1200 km2 divided into about 5400 privately owned plots. The software described here implements a system that acquires data from rain gauges located in the municipality of Bologna, then processes them and classifies the amount of rain falling in the study area into 5 different risk levels. This information is then notified to the user through the WDA platform, making it possible to react to flooding events even in areas adjacent to those classified as 'at risk'.
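A minimal Python sketch of the classification step described in the abstract - mapping a rain-gauge reading to one of five risk levels and building a notification payload - could look like this; the thresholds, level names and field names are assumptions, not those of the SWAMP/WDA implementation.

```python
# Hypothetical sketch of mapping a rain-gauge reading to one of 5 risk levels
# and producing a notification payload. Thresholds and labels are assumptions.

RISK_LEVELS = [  # (upper bound in mm/h, label) - illustrative thresholds
    (2.0, "level 1 - negligible"),
    (6.0, "level 2 - low"),
    (10.0, "level 3 - moderate"),
    (30.0, "level 4 - high"),
    (float("inf"), "level 5 - severe"),
]

def classify_rainfall(mm_per_hour: float) -> str:
    for upper, label in RISK_LEVELS:
        if mm_per_hour <= upper:
            return label
    return RISK_LEVELS[-1][1]

def build_notification(gauge_id: str, mm_per_hour: float) -> dict:
    return {"gauge": gauge_id, "rain_mm_h": mm_per_hour,
            "risk": classify_rainfall(mm_per_hour)}

if __name__ == "__main__":
    print(build_notification("bologna-03", 12.4))
```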
APA, Harvard, Vancouver, ISO, and other styles
9

Ismaili-Alaoui, Abir. "Methodology for an Augmented Business Process Management in IoT Environment." Electronic Thesis or Diss., Université de Lorraine, 2022. http://www.theses.fr/2022LORR0232.

Full text
Abstract:
Business Processes (BP) can be seen as a collection of activities executed and coordinated in order to produce a specific result and to meet the needs of a customer (internal and/or external). Business Process Management (BPM) is a very active research area; its objective is to provide a comprehensive and insightful analysis of the product flow and to identify inefficiencies and potential improvement areas in the process, in order to achieve better decision-making and results. Although several Business Process Improvement (BPI) methods can be found in the literature, organizations still face difficulties in applying these methods effectively. These difficulties can be explained by the fact that existing BPI methods do not fit all the recent requirements of organizations, nor the progress achieved in the past few years in several domains such as artificial intelligence, data analytics techniques, machine learning, process mining, and event (stream) processing. Besides, with this new digitized era and the rise of several new technologies such as big data, the Internet of Things and cloud computing, organizations are faced with many factors and challenges that generate real changes in traditional BPM. Among these challenges is the huge amount of data and event data that is continuously gathered within the organization. These data represent a real engine of growth for organizations and must be adequately exploited to extract the high added value that can assist the organization in its decision-making process. Furthermore, enterprises are looking for advanced technologies that optimize time and resources and increase agility, productivity and, most importantly, proactivity. However, traditional BPM systems present different limits, as they do not facilitate the use, by business processes, of the knowledge extracted from these data, because they do not benefit from statistical functionalities or from real-time data analysis and manipulation techniques. The interdisciplinary nature of BPM is a key factor that fosters opportunities for improvement in this domain. The objective of this thesis is to propose new approaches for augmenting business processes, relying mainly on data analysis, machine learning algorithms, and complex event processing, in order to exploit the data generated by business process execution (event data, event logs) and find ways to improve these processes from different perspectives such as instance scheduling and event management in an IoT environment.
APA, Harvard, Vancouver, ISO, and other styles
10

Sgarbi, Andrea. "Machine Cloud Connectivity: a robust communication architecture for Industrial IoT." Master's thesis, Alma Mater Studiorum - Università di Bologna, 2020.

Find full text
Abstract:
Industry 4.0 springs from the fourth industrial revolution, which is bringing innovation to fully automated and interconnected industrial production. This movement is composed of macro areas that expand the technological horizon, starting from the tools used to date. Data, computing power and connectivity are the fundamental concepts on which this thesis is based; they are declined into big data, open data, the Internet of Things (IoT), machine-to-machine communication and cloud computing for the centralization and storage of information. Once the data has been collected, it is necessary to derive value from it in order to obtain advantages from "machine learning", i.e. machines that improve their performance by "learning" from the data collected and analyzed. The advent of the Internet of Things can be seen in all respects as the greatest technological revolution of recent years, one that will bring a huge amount of information into the hands of users. This information can offer countless advantages in daily life and in the diagnostics of the production process. Industrial IoT (IIoT) enables manufacturing organizations to create a communication path through the automation pyramid, obtaining a real data stream in order to improve machine performance. From an information security point of view, the importance of the transmitted information should not be underestimated; this too is an important aspect of Industry 4.0. Protocols and authentication systems are constantly updated to ensure the privacy and security the customer needs. This thesis project addresses the implementation requirements in order to study and analyze different vendor technologies and to construct a cloud architecture. The focus is on cybersecurity and on avoiding information losses in order to obtain a robust transfer.
APA, Harvard, Vancouver, ISO, and other styles
11

Rahman, Hasibur. "Distributed Intelligence-Assisted Autonomic Context-Information Management : A context-based approach to handling vast amounts of heterogeneous IoT data." Doctoral thesis, Stockholms universitet, Institutionen för data- och systemvetenskap, 2018. http://urn.kb.se/resolve?urn=urn:nbn:se:su:diva-149513.

Full text
Abstract:
As an implication of the rapid growth in Internet-of-Things (IoT) data, the current focus has shifted towards utilizing and analysing the data in order to make sense of it, with the aim of making instantaneous, automated, and informed decisions that will drive the future IoT. This corresponds to extracting and applying knowledge from IoT data, which brings both a substantial challenge and high value. Context plays an important role in reaping value from data and is capable of countering the IoT data challenges. The management of heterogeneous contextualized data is infeasible and insufficient with existing solutions, which mandates new ones. Research until now has mostly concentrated on providing cloud-based IoT solutions; among other issues, this raises problems for real-time and faster decision-making. In view of this, this dissertation undertakes a study of a context-based approach entitled Distributed intelligence-assisted Autonomic Context Information Management (DACIM), the purpose of which is to efficiently (i) utilize and (ii) analyse IoT data. To address the challenges and solutions with respect to enabling DACIM, the dissertation starts by proposing a logical-clustering approach for proper IoT data utilization. The environment in which the growing number of Things is immersed changes rapidly and becomes dynamic. To this end, self-organization has been supported by proposing self-* algorithms, which resulted in 10 organized Things per second and a high accuracy rate for Things joining. IoT contextualized data further requires scalable dissemination, which has been addressed by a Publish/Subscribe model, and it has been shown that a high publication rate and faster subscription matching are realisable. The dissertation ends with the proposal of a new approach that assists the distribution of intelligence with regard to analysing context information, in order to elevate the intelligence of things. The approach allows a few of the applications of knowledge to be brought from the cloud to the edge, where the edge-based solution is provided with intelligence that enables faster responses and reduced dependency on rules by leveraging artificial intelligence techniques. To infer knowledge for different IoT applications closer to the Things, a multi-modal reasoner has been proposed, which demonstrates faster response. The evaluation of the designed and developed DACIM gives promising results, which are distributed over seven publications; from this, it can be concluded that it is feasible to realize a distributed intelligence-assisted context-based approach that contributes towards autonomic context information management in the ever-expanding IoT realm. (At the time of the doctoral defense, Paper 7 was unpublished and had the status: Submitted.)
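The Publish/Subscribe dissemination mentioned above can be illustrated with a tiny topic-matching broker in Python; the topic scheme and single-level wildcard rule are assumptions made for the example and are not DACIM's actual mechanism.

```python
# Illustrative sketch of topic-based subscription matching for contextualised
# IoT data; the topic scheme and matching rule are assumptions, not DACIM's.
from collections import defaultdict

class TinyBroker:
    def __init__(self):
        self.subs = defaultdict(list)           # topic pattern -> list of callbacks

    def subscribe(self, topic: str, callback):
        self.subs[topic].append(callback)

    def publish(self, topic: str, payload: dict):
        # deliver to exact-topic subscribers and to single-level '+' wildcards
        parts = topic.split("/")
        for sub_topic, callbacks in self.subs.items():
            sub_parts = sub_topic.split("/")
            if len(sub_parts) == len(parts) and all(
                    s in ("+", p) for s, p in zip(sub_parts, parts)):
                for cb in callbacks:
                    cb(topic, payload)

if __name__ == "__main__":
    broker = TinyBroker()
    broker.subscribe("room1/+/temperature", lambda t, p: print("got", t, p))
    broker.publish("room1/sensor42/temperature", {"value": 21.7})
```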
APA, Harvard, Vancouver, ISO, and other styles
12

Ralambotiana, Miora. "Key management with a trusted third party using LoRaWAN protocol: A study case for E2E security." Thesis, KTH, Skolan för elektroteknik och datavetenskap (EECS), 2018. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-230671.

Full text
Abstract:
Nowadays, Internet of Things (IoT) applications are gaining more importance in people's everyday life. Depending on their usage (for long- or short-distance communications, using low- or high-power devices, etc.), several standards exist. In this study, the focus is on Low Power Wide Area Networks (LPWAN) and particularly a protocol that is rising in popularity for long-range low-power communications in IoT: LoRaWAN. LoRaWAN is still at an early stage and has mainly been used in cases where the network server managed the keys ensuring confidentiality and integrity of the data. Gemalto has raised the issue of conflicts of interest in the case where the network operator and the application provider are two distinct entities: if the end-device and the application server are exchanging sensitive data, the network server should not be able to read them. In order to solve this problem, an architecture using a trusted third party to generate and manage the keys was implemented during this project. The following research aims at finding security threats and weaknesses affecting the confidentiality and integrity of the data and the authentication of devices in this case study. The LoRaWAN protocol and key management in general were studied first, before describing the studied system and identifying the possible attacks exploiting its vulnerabilities on the mentioned points via an attack tree. These attacks were simulated in order to determine their consequences on the system and, based on them, security improvements to the architecture were proposed, building on previous work on the topic and an exploration of potential countermeasures.
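One way to picture the trusted-third-party architecture is a sketch in which the third party holds the root AppKey, derives both LoRaWAN 1.0-style session keys, and releases only the network session key to the network server. The byte layout and padding below are simplified, and the role split is only one possible reading of the studied system; this is not a compliant LoRaWAN implementation.

```python
# Rough sketch of a trusted-third-party key derivation in the spirit of
# LoRaWAN 1.0: the TTP holds the root AppKey, derives both session keys, and
# hands only the NwkSKey to the network server, so the network operator never
# sees the AppSKey protecting application payloads. Simplified, not compliant.
import os
from cryptography.hazmat.primitives.ciphers import Cipher, algorithms, modes

def _aes_block(key: bytes, block: bytes) -> bytes:
    enc = Cipher(algorithms.AES(key), modes.ECB()).encryptor()
    return enc.update(block) + enc.finalize()

def derive_session_keys(app_key: bytes, app_nonce: bytes, net_id: bytes,
                        dev_nonce: bytes) -> dict:
    base = app_nonce + net_id + dev_nonce
    base = base.ljust(15, b"\x00")        # pad so the 1-byte prefix completes a 16-byte block
    return {
        "NwkSKey": _aes_block(app_key, b"\x01" + base),  # released to the network server
        "AppSKey": _aes_block(app_key, b"\x02" + base),  # kept for the application server
    }

if __name__ == "__main__":
    keys = derive_session_keys(os.urandom(16), os.urandom(3), os.urandom(3),
                               os.urandom(2))
    print({name: k.hex() for name, k in keys.items()})
```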
APA, Harvard, Vancouver, ISO, and other styles
13

De, Giosa Matteo. "Progettazione e validazione di un framework di algoritmi ensemble per la classificazione di Open Data IoT." Bachelor's thesis, Alma Mater Studiorum - Università di Bologna, 2019. http://amslaurea.unibo.it/19600/.

Full text
Abstract:
The amount of IoT data freely accessible on the network - commonly called Open Data - is potentially of great use for countless practical applications. However, such data are often unusable or incomprehensible, to the point that sometimes it is not even possible to discern what kind of observation was taken. Labelling these measurements therefore requires the application of classification models. This is not a simple task, however, because open data are in general very heterogeneous, so many of the algorithms commonly used in the literature have difficulty classifying them correctly. The main contribution of this thesis is therefore the presentation of MACE, an ensemble framework for the classification of IoT Open Data: after discussing its design and implementation, we evaluate its performance, demonstrating its effectiveness in solving what is, to date, a decidedly neglected problem in the literature.
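This is not MACE itself, but a minimal majority-vote ensemble in scikit-learn gives a feel for the kind of classifier the thesis builds for heterogeneous Open Data; the base learners and synthetic data are assumptions made for the example.

```python
# Minimal majority-vote ensemble as an illustration of ensemble classification
# of heterogeneous data; base learners and synthetic data are assumptions.
from sklearn.ensemble import VotingClassifier, RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.neighbors import KNeighborsClassifier
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score

X, y = make_classification(n_samples=300, n_features=6, n_informative=4,
                           n_classes=3, random_state=0)
ensemble = VotingClassifier(
    estimators=[("rf", RandomForestClassifier(random_state=0)),
                ("lr", LogisticRegression(max_iter=1000)),
                ("knn", KNeighborsClassifier())],
    voting="hard")                      # majority vote over the base learners
print("cv accuracy:", cross_val_score(ensemble, X, y, cv=5).mean().round(3))
```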
APA, Harvard, Vancouver, ISO, and other styles
14

Karlstedt, Johan M. "An ISD study of Extreme Information Management challenges in IoT Systems - Case: The "OpenSenses" eHealth/Smarthome project." Thesis, Blekinge Tekniska Högskola, Sektionen för datavetenskap och kommunikation, 2012. http://urn.kb.se/resolve?urn=urn:nbn:se:bth-4821.

Full text
Abstract:
Context: The Internet of Things (IoT) is an exciting new development and opportunity in the global ICT field. In short, IoT means that all objects will have their own IP address and be constantly online, connected to other similar actors. After this expansion, the Internet will no longer only be humans communicating with humans (H2H); machines will automatically communicate with machines (M2M) and with humans (H2M) as well. The already large amounts of data will become extreme as a result, and this will require new and smarter ways of designing future ICT systems. Many more things need to be taken into account for these demanding ICT development projects to turn out productively. As of right now it does not seem clear to developers which matters need to be looked at, in what order, and why. Clear IoT systems design guidelines, templates and view-models are missing, or are at least not available to mainstream designers. Objectives: In this study the objective is to investigate a) what views and concepts would need to be taken into account for developers to design better Information Systems for IoT types of setups and b) what kind of an ISD framework such design could result in. Methods: In this thesis a number of research methods have been used: a) conducting a broad literature review on related IS concepts; b) analyzing how the found concepts tie together; c) compiling an ISD framework based on the concepts; and d) applying the ISD framework to a real eHealth case. Results: The result of the study is that there is a need for a new ISD framework for larger systems design based on a stakeholder-centric viewpoint. This work puts forward an example of such an ISD framework. Conclusions: Designing and developing large Information Systems for IoT setups is a very demanding task, and a logical, clear design framework will help in this matter. If nothing else, a predefined view-model will help in asking the right kind of questions and keeping the communication about the objectives on track and focused. eHealth is a field where IoT systems should be implemented, but if this is not done in the correct way, taking matters like stakeholders' objectives and IPRs on both systems and data into account, very few productive results will be achieved.
APA, Harvard, Vancouver, ISO, and other styles
15

Floderus, Sebastian, and Vincent Tewolde. "Analysing privacy concerns in smart cameras: in correlation with GDPR and Privacy by Design." Thesis, Blekinge Tekniska Högskola, Institutionen för datavetenskap, 2021. http://urn.kb.se/resolve?urn=urn:nbn:se:bth-21980.

Full text
Abstract:
Background. The right to privacy is every person's right; data regulation laws such as the GDPR and privacy-preserving concepts like Privacy by Design (PbD) aid in this matter. IoT devices are highly vulnerable to attacks because of their limited storage and processing capabilities, even more so for internet-connected cameras. With the use of security auditing techniques and privacy analysis methods it is possible to identify security and privacy issues for Internet of Things (IoT) devices. Objectives. The research aims to evaluate three selected IoT cameras' ability to protect the privacy of their consumers, as well as to investigate the role GDPR and PbD have had in the design and operation of each device. Methods. A literature review was performed in order to gain valuable knowledge of how to design a case study that would evaluate privacy issues of IoT devices in correlation with GDPR and PbD. The case study consists of 14 cases designed to explore security- and privacy-related issues. They were executed in a monitored and controlled network environment to detect data flow between devices. Results. There was a noticeable difference in the security and privacy enhancing technologies used between some manufacturers. Furthermore, there was a distinct disparity in how transparent each system was with the processed data, which is a crucial part of both GDPR and PbD. Conclusions. All three companies had taken GDPR and PbD into consideration in the design of their IoT systems, however to different extents. One of the IoT manufacturers could benefit from incorporating PbD more thoroughly into the design and operation of their product. Also, the GDPR could benefit from having references to security standards and frameworks in order to simplify the process for companies to secure their systems.
APA, Harvard, Vancouver, ISO, and other styles
16

Lasciarrea, Luca. "La quarta rivoluzione industriale." Bachelor's thesis, Alma Mater Studiorum - Università di Bologna, 2018.

Find full text
Abstract:
When we speak of an industrial revolution we mean a precise moment that represents the start and marks the point of no return: it is the moment in which the change, which until then had remained under the surface, suddenly becomes evident and establishes itself as an unstoppable phenomenon, destined to change the frame of reference for everyone forever, not only for specialists and insiders. The objective of this work is to frame, in economic and social terms, the impact that today's revolution is producing, trying to analyse the possible consequences on employment due to the continuous introduction of new technologies. In this regard, ample space is given to the enabling technologies, i.e. the new technologies that are and will be the pivot of this epochal turning point. Moving from autonomous vehicles to 3D printing, from Big Data to Fog Computing, from synthetic biology to the study of genomes, the technologies are divided into three macro spheres: physical, digital and biological. The meeting point of the enabling technologies is certainly data, which has gone from being a simple piece of information, created and discarded, to one of the main assets of modern companies. Data is characterized by use value, like the workforce, and is transformed into exchange value within production contexts able to use the appropriate algorithmic technology. This process, however, is far from being precise and homogeneous. Starting from these considerations, this work attempts to formulate a theory of the exchange value produced by data. In conclusion, the ethical aspects and the emergence of continuous privacy issues related to the great amount of data we produce every day are analysed.
APA, Harvard, Vancouver, ISO, and other styles
17

Tania, Zannatun Nayem. "Machine Learning with Reconfigurable Privacy on Resource-Limited Edge Computing Devices." Thesis, KTH, Skolan för elektroteknik och datavetenskap (EECS), 2021. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-292105.

Full text
Abstract:
Distributed computing allows effective data storage, processing and retrieval, but it poses security and privacy issues. Sensors are the cornerstone of IoT-based pipelines, since they constantly capture data until it can be analyzed at the central cloud resources. However, these sensor nodes are often constrained by limited resources. Ideally, it is desired to make all the collected data features private, but due to resource limitations this may not always be possible. Making all the features private may cause overutilization of resources, which would in turn affect the performance of the whole system. In this thesis, we design and implement a system that is capable of finding the optimal set of data features to make private, given the device's maximum resource constraints and the desired performance or accuracy of the system. Using generalization techniques for data anonymization, we create user-defined injective privacy encoder functions to make each feature of the dataset private. Regardless of resource availability, some data features are defined by the user as essential features to make private. All other data features that may pose a privacy threat are termed the non-essential features. We propose Dynamic Iterative Greedy Search (DIGS), a greedy search algorithm that takes the resource consumption of each non-essential feature as input and returns the most optimal set of non-essential features that can be made private given the available resources. The most optimal set contains the features which consume the least resources. We evaluate our system on a Fitbit dataset containing 17 data features, 4 of which are essential private features for a given classification application. Our results show that we can provide 9 additional private features apart from the 4 essential features of the Fitbit dataset containing 1663 records. Furthermore, we can save 26.21% memory compared to making all the features private. We also test our method on a larger dataset generated with a Generative Adversarial Network (GAN). However, the chosen edge device, a Raspberry Pi, is unable to cater to the scale of the large dataset due to insufficient resources. Our evaluations using 1/8th of the GAN dataset result in 3 extra private features with up to 62.74% memory savings compared to making all data features private. Maintaining privacy not only requires additional resources, but also has consequences for the performance of the designed applications. However, we discover that privacy encoding has a positive impact on the accuracy of the classification model for our chosen classification application.
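The greedy idea behind DIGS can be sketched in a few lines: given a per-feature cost of privacy encoding and a remaining resource budget, keep adding the cheapest non-essential features until the budget runs out. The costs and budget below are invented numbers, and the sketch is not the thesis's exact algorithm.

```python
# Illustrative greedy selection in the spirit of DIGS: add the cheapest
# non-essential features until the privacy-encoding budget is exhausted.
# Costs and budget are invented; this is not the thesis's exact algorithm.

def greedy_private_features(costs: dict[str, float], budget: float) -> list[str]:
    """Return the non-essential features to encode, cheapest first."""
    selected, used = [], 0.0
    for feature, cost in sorted(costs.items(), key=lambda kv: kv[1]):
        if used + cost <= budget:
            selected.append(feature)
            used += cost
    return selected

if __name__ == "__main__":
    non_essential_costs = {"steps": 1.2, "floors": 0.8, "calories": 2.5,
                           "sleep_minutes": 1.0, "resting_hr": 3.1}   # MB, assumed
    print(greedy_private_features(non_essential_costs, budget=4.0))
```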
APA, Harvard, Vancouver, ISO, and other styles
18

Percudani, Pietro, and Mohamad Batrawi. "The Impact of Internet of Things unification with Project Management Disciplines in project-based organizations." Thesis, Umeå universitet, Företagsekonomi, 2017. http://urn.kb.se/resolve?urn=urn:nbn:se:umu:diva-145428.

Full text
Abstract:
The greatest advantage of Information Technology (IT) is its ability to enable personnel to achieve their goals, allowing personnel to grasp knowledge and skills they were not previously aware of; as former Microsoft CEO Steve Ballmer expressed it, it is all about potential. Internet of Things (IoT) data, according to ORACLE (2017), provides insight from newly collected data and provides solutions, thus allowing businesses to deliver new, innovative services in a more efficient and productive manner while reducing risk factors, provided that the connections between the organisation and its devices are securely connected, analysed, and integrated with IoT data. Project Management (PM) is the leading management discipline that benefits enterprises through actual and operative management of change, through its systematic approach of initiating, planning, executing, monitoring and controlling, testing and commissioning, and finally handing the project over to the client; it manages various types of projects with various drivers of change and uncertainty (Sawyer, L. 2016). As significant as technology has become in our lives, this study aims at highlighting the importance of the Internet of Things and the synergic implementation of Project Management disciplines in project-oriented organisations. It also explores the challenges, barriers, and benefits of IoT in synergy with PM disciplines. The paper also considers one of the most crucial elements of any organization or business, people, focusing on project managers and how the role of a project manager is affected in innovative project-oriented organizations.
APA, Harvard, Vancouver, ISO, and other styles
19

Kadariya, Dipesh. "kBot: Knowledge-Enabled Personalized Chatbot for Self-Management of Asthma in Pediatric Population." Wright State University / OhioLINK, 2019. http://rave.ohiolink.edu/etdc/view?acc_num=wright1565944979193573.

Full text
APA, Harvard, Vancouver, ISO, and other styles
20

Pazzi, Stefano. "Internet of Things ed implementazione del sistema di misurazione della performance in ambito produttivo. Il caso Amadori." Master's thesis, Alma Mater Studiorum - Università di Bologna, 2019.

Find full text
Abstract:
The project refers to several enabling technologies of the Industry 4.0 paradigm, in particular the Internet of Things and Data Analytics. It also addresses the topic of Performance Measurement Systems (PMS): the 'glue' of the entire work, it takes on considerable importance in an innovative context. The case of the Amadori Group is considered. The objective of the project is to implement innovative solutions supporting the forming and packaging process for the production of fixed-weight articles in a pilot department. First, in the initial phase of the work, a PMS was implemented to provide informational support by monitoring production efficiency both in real time and retrospectively. Inefficiency here mainly refers to the difference between the actual content and the nominal mass of the pre-packaged product (giveaway). Reference is therefore made to Key Performance Indicators (KPIs), to the definition of targets and to the implementation of an interactive control system. As regards the retrospective analysis, reports were produced to meet the different needs of two company functions: Management Control and Production. The second phase then focuses on a pilot line. The machines are integrated in order to implement a mechanism that automatically adjusts the weight setting of the upstream machine on the basis of the data measured by the downstream scale. In particular, the definition of the correction algorithm and the interfaces that allow the operator to interact with the implemented tool are described. The tool therefore intervenes prescriptively on the system and represents an application that fits into the Industry 4.0 paradigm. What the development of the entire project is expected to deliver, in particular from the integration of the machines on the pilot line, is a substantial reduction in the giveaway value.
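The upstream-weight correction mechanism described in this abstract can be pictured as a simple feedback loop: the downstream checkweigher reports the average giveaway over a batch, and the target weight of the upstream forming machine is nudged in the opposite direction. The sketch below is a generic illustration under assumed names and gain values, not the algorithm actually deployed at Amadori.

```python
def corrected_setpoint(current_setpoint, nominal_weight, measured_weights, gain=0.5):
    """Return a new target weight for the upstream forming machine.

    current_setpoint -- target weight currently configured upstream (g)
    nominal_weight   -- declared weight of the fixed-weight article (g)
    measured_weights -- weights reported by the downstream scale for a batch (g)
    gain             -- fraction of the observed giveaway to correct per cycle
    """
    avg = sum(measured_weights) / len(measured_weights)
    giveaway = avg - nominal_weight          # positive = product given away
    return current_setpoint - gain * giveaway

# Hypothetical batch: nominal 500 g articles averaging 507 g on the scale.
print(corrected_setpoint(505.0, 500.0, [506.0, 508.5, 506.5]))  # 501.5
```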
APA, Harvard, Vancouver, ISO, and other styles
21

Pisapia, Claudio. "SAP HANA Energy Management System: creazione di un prototipo di business in Syskoplan Reply." Master's thesis, Alma Mater Studiorum - Università di Bologna, 2017.

Find full text
Abstract:
This thesis focuses on the creation of an Energy Management System (EMS) able to meet the needs of enterprises on both the manufacturing and the utilities side. The first chapter, in addition to an introduction to the EMS topic, describes the role of the Energy Manager and gives an overview of the relevant regulations, namely ISO 50001 and Directive 2012/27/EU. The second chapter stresses the importance of data; in particular, the topics of Business Intelligence, Big Data, the Internet of Things and Predictive Analytics are described in detail. The third chapter then describes the software used for data analysis and for the development of the EMS, which in this case is SAP, an ERP system used in medium and large companies that manages practically every internal area of the enterprise. After a brief introduction, the focus shifts to the SAP tools for energy management, and the modules used throughout the thesis are described. Finally, in the fourth chapter, the different phases of the EMS creation project are described through a PDCA (Plan-Do-Check-Act) cycle, starting in the planning phase with the definition of the KPIs and the identification of the queries. Then, in the 'do' phase, the links between the information contained in the different queries are defined, and the tables and the joins between them are created. In the monitoring phase, analyses are carried out on the data in order to understand where the main problems lie and how to overcome or correct them. Finally, in the acting phase, the data, tables, analyses and solutions to the problems are brought into a web application customised for energy management, so as to make plant monitoring quick and easy for the customer to read and to direct them towards any corrective actions to be taken.
APA, Harvard, Vancouver, ISO, and other styles
22

Kropsu-Vehkaperä, H. (Hanna). "Enhancing understanding of company-wide product data management in ICT companies." Doctoral thesis, Oulun yliopisto, 2012. http://urn.fi/urn:isbn:9789514297984.

Full text
Abstract:
Data is becoming a more critical success factor as business processes rely increasingly on information systems. Product data is required to produce, sell, deliver, and invoice a product in information systems. Traditionally, product data and product data management (PDM) studies have focused on product development and related activities, with less attention being paid to PDM in other lifecycle phases. The purpose of this doctoral dissertation is to clarify challenges and prerequisites for company-wide PDM. The study covers the entire product lifecycle and provides potential solutions for developing company-wide PDM and enhancing the understanding of PDM as a company-wide action. The study was realised by collecting and analysing data from ICT companies that are seeking better ways to manage a wide product range, technologically complex products and comprehensive solutions by enhancing their data management practices. The practitioners' empirical experiences and perceptions are seen to have increased knowledge of company-wide PDM. This study adopted a case study approach and utilises interviews as the main data collection method. This study indicates that company managers have already realised that successful business operations require a higher-level understanding of products and related product data. In practice, however, several challenges hinder the ability to achieve the goal of higher-level business-driven PDM. These challenges include product harmonisation, PDM process development requirements and information systems development requirements. The results of this research indicate that product harmonisation is required to better support efficient product data management. Understanding the true nature of product data, that is, the combination of product master data and other general product data, and the content of product data from different stakeholder perspectives are prerequisites for functional company-wide PDM. Higher-level product decisions have a significant impact on product data management. Extensive product ranges require general guidelines in order to be manageable, especially as even single products are complex. The results of this study indicate that companies should follow a top-down approach when developing their PDM practices. The results also indicate that companies require a generic product structure in order to support unified product management. The main implication of this dissertation is the support it provides for managers in terms of developing true company-wide product data management practices.
APA, Harvard, Vancouver, ISO, and other styles
23

ZAZA, CLAUDIO. "ICT tools for data management and analysis to support decisional process oriented to sustainable agri-food chains." Doctoral thesis, Università di Foggia, 2018. http://hdl.handle.net/11369/369199.

Full text
Abstract:
The agri-food sector is facing global challenges. The first issue concerns feeding a world population that in 2050, according to United Nations projections, will reach 9.3 billion people. The second challenge is the request by consumers for high-quality products obtained from more sustainable, safer and more transparent agri-food chains. In particular, sustainable agriculture is a management strategy able to preserve the biological diversity, productivity, regeneration capacity, vitality and ability to function of an agricultural ecosystem, ensuring, today and in the future, significant ecological, economic and social functions at the local, national and global scales, without harming other ecosystems. Therefore, to face the challenge of sustainable agriculture, farmers need to increase the quality and quantity of production, reducing the environmental impact through new management strategies and tools. This work explores the integration of several ICT technologies and methodologies in the agri-food sector for data acquisition, management and analysis, such as RFID (Radio Frequency IDentification) technology, Farm Management Information Systems (FMIS), Data Warehouses (DW) and On-Line Analytical Processing (OLAP). Finally, the adoption of ICT technologies by real farms is evaluated through a survey. Regarding the adoption of RFID technology, this work explores an opportunity for technology transfer related to the monitoring and control of agri-food products, based on the use of miniaturized, smart and innovative sensors. The information concerning the state of the product is transferred in real time in a wireless way, according to the RFID technology. In particular, two technical solutions involving RFID are provided, highlighting the advantages and critical points compared with the conventional systems used to ensure the traceability and quality of agri-food products. This work then explores the possibility of developing a framework that combines business intelligence (BI) technologies with Integrated Pest Management (IPM) principles to support farmers in the decisional process, thereby decreasing environmental cost and improving production performance. The IPM requires the simultaneous use of different crop protection techniques to control pests through an ecological and economic approach. The proposed BI system is called BI4IPM, and it combines on-line transaction processing (OLTP) with OLAP to verify adherence to the IPM technical specifications. BI4IPM is tested with data from real Apulian olive crop farms.
The olive tree is one of the most important crops at the global scale and Apulia is the leading olive-producing region in Italy, with a huge number of farms that generate IPM data. Crop protection strategies are correlated to climate conditions, considering the very important relation among climate, crops and pests. Therefore, this work presents a new, advanced OLAP model integrating the Growing Season Index (GSI), a phenology model, to compare farms indirectly from a climatic point of view. The proposed system allows analysing IPM data of different farms having the same phenological conditions over a year, in order to identify best practices and to highlight and explain different practices adopted by farms working in different climatic conditions. Finally, a survey was performed aimed at investigating how farms in Basilicata cluster according to the level of innovation adopted. A questionnaire was used to ask whether farms adopt ICT tools and, if so, in which management or production processes they are used. A cluster analysis was then performed on the collected data. The results show that, using the k-means clustering method, two clusters appear: innovators and the rest. Using a boxplot representation, three groups emerge: innovators, early adopters and laggards.
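The Growing Season Index mentioned above multiplies three simple ramp functions (minimum temperature, vapour pressure deficit, photoperiod) into a daily suitability value between 0 and 1. Below is a minimal sketch using thresholds commonly cited from Jolly et al. (2005); the thresholds and input values are assumptions for illustration, not figures taken from the thesis.

```python
def ramp(x, lo, hi, falling=False):
    """Linear ramp between 0 and 1; 'falling' reverses the direction."""
    frac = min(max((x - lo) / (hi - lo), 0.0), 1.0)
    return 1.0 - frac if falling else frac

def daily_gsi(tmin_c, vpd_pa, photoperiod_h):
    """Daily GSI indicator as the product of three limiting factors."""
    i_tmin = ramp(tmin_c, -2.0, 5.0)                    # cold limitation
    i_vpd = ramp(vpd_pa, 900.0, 4100.0, falling=True)   # dryness limitation
    i_photo = ramp(photoperiod_h, 10.0, 11.0)           # day-length limitation
    return i_tmin * i_vpd * i_photo

# Hypothetical spring day in Apulia: Tmin 8 degC, VPD 1200 Pa, 12 h of daylight.
print(round(daily_gsi(8.0, 1200.0, 12.0), 3))  # ~0.906
```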
APA, Harvard, Vancouver, ISO, and other styles
24

Mirabella, Julienne. "Sviluppo di un'applicazione iOS basata su Open Data in ambito culturale." Bachelor's thesis, Alma Mater Studiorum - Università di Bologna, 2017.

Find full text
Abstract:
New technologies make it possible to create services that respond to the questions and needs of the population. Much of the data needed to meet these needs is produced by public bodies, people and organisations that collect a wide range of different data to carry out their tasks. Open Data is a means of making this data available and reusable by anyone. In this thesis project I wanted to show an example of the use of Open Data in the implementation of an iOS mobile application called MuseumsBo, with the aim of providing users with easy access to data concerning the cultural context of the city of Bologna.
APA, Harvard, Vancouver, ISO, and other styles
25

Kosíková, Renáta. "Řešení informačních potřeb obchodního týmu v leasingové společnosti s využitím ICT nástrojů." Master's thesis, Vysoká škola ekonomická v Praze, 2014. http://www.nusl.cz/ntk/nusl-198475.

Full text
Abstract:
Nowadays information is considered an essential company resource, crucial for attaining a competitive advantage. The competitive advantage can be achieved by acquiring relevant information, which is a prerequisite for knowledge creation. Acquired knowledge gives businesses an opportunity to improve their position against competitors quickly by finding the information required for decision making. Hence the aim of this thesis is to examine the problem of acquiring relevant information, related to practice in a leasing company. In order to find the relevant information, the leasing company must precisely define its information needs so that it can obtain the relevant information more efficiently, in particular by using current information management tools. At present, information management is strongly affected by contemporary trends, such as Big Data, mobility, Cloud Computing, social media, etc. As a consequence of these trends, the volume of electronically processed information increases every day. Due to information oversaturation, information users may face a difficult information search problem. Using both modern information management tools and precisely defined information needs, the company can obtain the relevant information in less time, or more information in the same time, compared to other firms. This might allow the company to make decisions more rapidly. In addition, information management tools are likely to uncover new contexts of which the company was not aware.
APA, Harvard, Vancouver, ISO, and other styles
26

Ruiz, Gerard. "Distributed Data Management in Internet of Things Networking Environments : IOTA Tangle and Bitcoin Blockchain Distributed Ledger Technologies." Thesis, Linnéuniversitetet, Institutionen för datavetenskap och medieteknik (DM), 2018. http://urn.kb.se/resolve?urn=urn:nbn:se:lnu:diva-77359.

Full text
Abstract:
Distributed ledger technology (DLT) is one of the latest in a long list of digital technologies which appear to be heading towards a new industrial revolution. DLT has become very popular with the publication of the Bitcoin Blockchain in 2008. However, when we consider its suitability for dynamic networking environments, such as the Internet of Things, issues like transaction fees, scalability, and offline accessibility have not been resolved. The IOTA Foundation has designed the IOTA protocol, which is the data and value transfer layer for the Machine Economy. The IOTA protocol uses an alternative, blockless blockchain which claims to solve the previous problems: the Tangle. This thesis first inquires into the theoretical concepts of both technologies, Tangle and Blockchain, to understand them and identify the reasons they are or are not compatible with Internet of Things networking environments. After the analysis, the thesis focuses on the proposed implementation as a solution to address the connectivity issue suffered by the IOTA network. The answer to the problem is the development of a Neighbor Discovery algorithm, which has been designed to fulfill the requirements demanded by the IOTA application. Dealing with the IOTA network setup can be very interesting for the community, which is looking for new improvements at each release. Testing the solution in a peer-to-peer network simulator (PeerSim), with different networking scenarios, allowed us to get valuable and more realistic information. Thus, after analyzing the results, we were able to determine the appropriate IOTA network configuration to build a more reliable and long-lasting network.
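What a neighbor discovery loop can look like in general is sketched below: a node repeatedly asks nodes it already knows for the peers they advertise and keeps a bounded neighbor set. This is a generic gossip-style illustration only, not the algorithm designed in the thesis, and the topology is a toy stand-in for real IOTA nodes.

```python
import random

def discover_neighbors(start_peers, get_peers, max_neighbors=7, rounds=5):
    """Generic gossip-style neighbor discovery.

    start_peers   -- addresses of initially known nodes (entry points)
    get_peers     -- callable returning the peer list advertised by a node
    max_neighbors -- how many neighbors the node ultimately keeps
    """
    known = set(start_peers)
    for _ in range(rounds):
        if not known:
            break
        probe = random.choice(sorted(known))   # ask one known node
        known.update(get_peers(probe))         # merge its advertised peers
    return sorted(known)[:max_neighbors]

# Toy in-memory topology standing in for real nodes and their peer lists.
topology = {"A": ["B", "C"], "B": ["C", "D"], "C": ["E"], "D": [], "E": ["F"]}
print(discover_neighbors(["A"], lambda n: topology.get(n, [])))
```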
APA, Harvard, Vancouver, ISO, and other styles
27

Salamoun, Sioufi Randa. "Boundary management in ICT-enabled work : exploring structuration in information systems research." Thesis, University of Manchester, 2013. https://www.research.manchester.ac.uk/portal/en/theses/boundary-management-in-ictenabled-work-exploring-structuration-in-information-systems-research(a5110203-c9ba-4f92-85d3-4bb7b4e43d25).html.

Full text
Abstract:
ICTs have enabled increased mobility and created a new era of workplace connectivity. Due to changes in work organization, global operations, increased mobility, and the new opportunities they are creating, work requires more coordination, more travel and a higher frequency of boundary spanning. ICTs have infiltrated the personal life of individuals, while similarly having an increasing impact on how organizations manage their workers' work-life balance. This research investigates the work boundary negotiation process in ICT-enabled work. Using an in-depth case study supplemented with visual data, this thesis studies the case of Sigma, an international consulting firm that serves clients located in a large geographical area. It explores how consultants exhibiting mobile work practices use ICTs to negotiate work boundaries. It draws on the structurational model of technology and complements it with the boundary object construct. The utilisation of this combined approach allows further understanding of work boundary negotiation. The research reveals that some ICTs, as technological artefacts, are boundary objects bridging between different groups of actors, crossing work boundaries, and allowing actors to negotiate their work boundaries while challenging traditional boundaries, thus allowing consultants to use their ICTs (specifically their smartphones) to negotiate their work boundaries on an as-needed basis. The boundary negotiation process (as revealed by the structuration process) is the means by which consultants try to make the most of existing social structures, in this case specifically domination, in their organizational context. ICT becomes a source of power and is mainly used to manifest domination over available resources. Consultants use ICTs to maintain control over their life, increase their legitimacy and convey that they are professional experts. ICTs allow consultants to continuously redefine their work boundaries, which become dynamic, fluid and contextual; the research reaffirms the sociotechnical nature of work boundaries. The thesis also develops a conceptual model of work boundary negotiation that conceptually illustrates how boundary negotiation is the outcome of the structuration process and the negotiation of existing structures of domination, legitimation and signification.
APA, Harvard, Vancouver, ISO, and other styles
28

Cocchieri, Alessio. "Progettazione e sviluppo di un software di esportazione dati per applicazioni IoT di monitoraggio strutturale." Bachelor's thesis, Alma Mater Studiorum - Università di Bologna, 2021. http://amslaurea.unibo.it/24261/.

Full text
Abstract:
The web today generates an enormous amount of data that needs to be managed appropriately in order to obtain information that has value and supports the decision-making processes activated within organisations. In the era we are living in, potentially any object can be 'smart' and therefore capable of collecting, generating and storing data in databases. In this context, the Internet of Things (IoT) plays a decisive role, taking shape as an extension of the traditional Internet in which connectivity is extended to objects. This work explores the theme of IoT applied to structural health monitoring, that is, the constant monitoring of infrastructures aimed at preventing possible damage and favouring their long-term preservation. To this end, a practical case is proposed of the design and development of a data export for IoT structural monitoring applications, integrated into the Modron platform, a Web of Things platform used in the INAIL 2018 MAC4PRO project.
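As an illustration of what such a data export can look like at its core, the sketch below dumps a batch of sensor readings to a CSV file. The field names and the reading structure are invented for the example and do not reflect the Modron platform's actual data model.

```python
import csv
from datetime import datetime, timezone

def export_readings(readings, path):
    """Write structural-monitoring readings to a CSV file.

    readings -- iterable of dicts with keys: sensor_id, timestamp, quantity, value
    path     -- destination CSV file
    """
    fields = ["sensor_id", "timestamp", "quantity", "value"]
    with open(path, "w", newline="") as handle:
        writer = csv.DictWriter(handle, fieldnames=fields)
        writer.writeheader()
        writer.writerows(readings)

# Hypothetical strain-gauge readings.
now = datetime.now(timezone.utc).isoformat()
export_readings(
    [{"sensor_id": "SG-01", "timestamp": now, "quantity": "strain", "value": 12.4},
     {"sensor_id": "SG-02", "timestamp": now, "quantity": "strain", "value": 11.9}],
    "readings.csv",
)
```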
APA, Harvard, Vancouver, ISO, and other styles
29

ADAMASHVILI, NINO. "Big data analytics tools for improving the decision-making process in agrifood supply chain." Doctoral thesis, Università di Foggia, 2021. https://hdl.handle.net/11369/425167.

Full text
Abstract:
Introduction: In the interest of ensuring long-term food security and safety in the face of changing circumstances, it is interesting and necessary to understand and to take into consideration the environmental, social and economic aspects of food and beverage production in relation to consumer demand. Besides, due to globalization, problems have been raised such as long supply chains, information asymmetry, counterfeiting, the difficulty of tracing and tracking back the origin of products, and numerous related issues such as consumers' well-being and healthcare costs. Emerging technologies drive the achievement of new socio-economic approaches as they enable governments and individual agricultural producers to collect and analyze an ever-increasing amount of environmental, agronomic and logistic data, and they give consumers and quality control authorities the possibility to access all necessary information easily and at short notice. Aim: The object of the research essentially concerns the study of ways to improve the production process by reducing information asymmetry, making it available to interested parties in a reasonable time, analyzing the data about production processes considering the environmental impact of production in terms of ecology, economy, food safety and food quality, and building the opportunity for stakeholders to make informed decisions, as well as simplifying the control of quality, counterfeiting and fraud.
Therefore, the aim of this work is to study current supply chains, to identify their weaknesses and necessities, to investigate the emerging technologies, their characteristics and their impacts on supply chains, and to provide the industry, governments and policymakers with useful recommendations.
APA, Harvard, Vancouver, ISO, and other styles
30

Chauma, Crecencia Naison. "The application of project management tools and techniques in ICT SME projects in Western Cape." Thesis, Cape Peninsula University of Technology, 2017. http://hdl.handle.net/20.500.11838/2546.

Full text
Abstract:
Thesis (MTech (Business Administration))--Cape Peninsula University of Technology, 2017. Introduction: The research looked at the application of project management tools and techniques in ICT SME projects in the Western Cape. Problem Statement: Previous literature revealed that small to medium enterprises (SMEs) are vital to developing economies as they provide employment and contribute to overall sustainable economic productivity. The literature further suggests that project management tools and techniques enhance SMEs' ability to innovate, grow and compete in industry. However, it is unclear whether ICT SMEs in the Western Cape are applying project management tools and techniques in their projects. Therefore, this study aimed to investigate whether ICT SMEs in the Western Cape applied these tools and techniques in their projects. Aims / Objectives: The primary objective of the research was to determine the extent to which ICT SMEs in the Western Cape were using project management tools and techniques. The secondary research objectives were: to determine the extent to which ICT SMEs in the Western Cape knew about project management tools and techniques and how to use them; to establish the extent to which the ICT SMEs knew the benefits of using project management tools and techniques; and to determine the extent to which project management tools and techniques were used by SMEs to achieve success. Methodology: The research was non-experimental. An electronic questionnaire was distributed using Survey Monkey and Mail Chimp to collect responses. Some questionnaires were hand-delivered to ICT SMEs based in the Cape Town CBD areas accessible to the researcher. The results presented in the research were based on a survey of ICT SMEs located in the Western Cape. Out of the 341 responses targeted, a total of 210 responses were obtained. The results obtained represent 60-70% of the population interviewed. The sample was chosen using stratified random sampling that classified the respondents according to organisational hierarchy, and the number per stratum was noted. The respondents within each stratum were chosen using simple random sampling, thus eliminating bias.
APA, Harvard, Vancouver, ISO, and other styles
31

Bazzali, Denis. "Un'applicazione mobile di guida turistica context-aware basata su linked open data." Bachelor's thesis, Alma Mater Studiorum - Università di Bologna, 2014. http://amslaurea.unibo.it/7446/.

Full text
Abstract:
This work presents Semantic City Guide, a tourist guide mobile application based on Linked Open Data. It aims to present the main advantages and disadvantages arising from the interaction between native mobile application development and Semantic Web technologies. This is contextualised by examining several projects by companies and public bodies operating in the tourism and IT sectors.
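A minimal example of how a mobile guide can pull points of interest from a Linked Open Data source is sketched below, here querying the public DBpedia SPARQL endpoint with the SPARQLWrapper library. The endpoint and query are illustrative choices, not the data sources used in the thesis.

```python
from SPARQLWrapper import SPARQLWrapper, JSON

# Ask DBpedia for a few museums located in Bologna (illustrative query).
sparql = SPARQLWrapper("https://dbpedia.org/sparql")
sparql.setQuery("""
    SELECT ?museum ?name WHERE {
        ?museum a dbo:Museum ;
                dbo:location dbr:Bologna ;
                rdfs:label ?name .
        FILTER (lang(?name) = "en")
    } LIMIT 5
""")
sparql.setReturnFormat(JSON)

results = sparql.query().convert()
for row in results["results"]["bindings"]:
    print(row["name"]["value"], "->", row["museum"]["value"])
```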
APA, Harvard, Vancouver, ISO, and other styles
32

Kočíbová, Iveta. "Návrh migrace části ICT infrastruktury do datového centra." Master's thesis, Vysoké učení technické v Brně. Fakulta podnikatelská, 2016. http://www.nusl.cz/ntk/nusl-241606.

Full text
Abstract:
This Master's Thesis deals with the design of the migration of part of the ICT infrastructure of two sites of an international company to an external data center. It analyzes the current status of the company's ICT and describes the virtualization infrastructure and its operational costs. The main part of the thesis describes a possible solution for migrating the virtualization infrastructure to the external data center using project management tools and methods. Finally, the thesis summarizes the project's benefits for the company.
APA, Harvard, Vancouver, ISO, and other styles
33

Lundmark, Erik. "Organisational Adoption of Innovations : Management Practices and IT." Licentiate thesis, Linköping University, Linköping University, Department of Management and Engineering, 2008. http://urn.kb.se/resolve?urn=urn:nbn:se:liu:diva-11537.

Full text
Abstract:
This thesis describes effects of use and reasons for using three different organisational innovations: ISO 9000, Information and Communication Technologies (ICT) and an administrative tool (the YAF-module) in the Swedish Sports Confederation's system Swedish Sports Online. This is done through three separate studies. The first study is directed at Swedish Small and Medium Sized Enterprises (SMEs) and the two following studies are directed at Swedish sport associations. The thesis contains three separate essays presenting the studies and an introductory part where the studies are compared.
In the introductory part of the thesis the interaction patterns between organisation and innovation are compared and discussed. I discuss the level of effort put into the decision and implementation processes, and how this is related to satisfaction with the innovations. The patterns that emerged are quite different in the three studies. Understanding these different interaction patterns between organisation and innovation is a step away from a beneficial/detrimental dichotomy of innovations.
The decision and implementation processes differ between the three studies regarding what parts of the organisations were involved. In the first study we saw top-down decision and implementation processes, whereas in the second study we saw bottom-up or middle-up processes. In the third study the decision and implementation were much narrower in scope, often involving only one person. I also describe how all the perspectives suggested by Abrahamson (1991) (efficient choice, forced selection, and the fad and fashion perspectives) bear some grain of truth for the adoption of ISO 9000 by SMEs and the adoption of ICT by sport associations, whereas imitation (the fad and fashion perspectives) is less important in the adoption of the YAF-module. Furthermore, I discuss the parallels between human and organisational decision-making.
Summary of the first essay – The aim of the first study is to investigate the effects of quality management in accordance with ISO 9000 as viewed by both quality managers and other managers. We also consider the way companies carried out the recertification process to ISO 9001:2000 and what consequences different approaches brought. The study is based on Swedish SMEs with an ISO 9000:1994 certificate that had recertified according to the ISO 9001:2000 standard. The strongest, most obvious and most valued effects of the ISO 9000 standard are clearer and more apparent working procedures and responsibilities. The most apparent problem is bureaucracy, which according to some managers can lead to reduced flexibility. The effects of the certification vary depending on how the certification project is conducted and how consultants are used.
Summary of the second essay – This essay presents a descriptive study of the use of information and communication technology (ICT) and the change in communication patterns in Swedish sport associations over the period 1994 to 2003. The change is discussed in light of Internet and broadband diffusion. Results show that new channels for communication have been adopted, primarily Web sites and e-mail, but few established channels have been dropped. While there are associations that save time and money and increase the spirit of community using ICT, many organisations experience the increased number of communication channels as a burden, since maintaining them takes extra resources but the benefits are not always easy to detect or measure. Certain characteristics common among non-profit organisations (NPOs) as well as Internet and broadband access have influenced the development of ICT use.
Summary of the third essay – This essay presents a new model for analysing adoption of discretionary, public information systems (PIS) with digital use patterns (such as use or non-use, as opposed to frequency of use, or degree of engaged or compliant use). The model is based on Rogers' innovation diffusion theory (IDT) and Nilsson's user centred access model (UCAM). The model is an alternative to the general technology acceptance model (TAM). The AKAM-Model identifies six prerequisites for use and four management approaches and describes how these are related. To illustrate its applicability, the AKAM-Model is used to analyse the adoption of a specific module, the YAF-module, in the Swedish Sports Confederation's (SSC) system Swedish Sports Online. We present empirical results that indicate the frequency and importance of the barriers and driving forces as experienced by the YAF-module users and the potential YAF-module users.
Report code: LIU-TEK-LIC-2008:10.
APA, Harvard, Vancouver, ISO, and other styles
34

Gonella, Philippe. "Business Process Management and Process Mining within a Real Business Environment: An Empirical Analysis of Event Logs Data in a Consulting Project." Master's thesis, Alma Mater Studiorum - Università di Bologna, 2016. http://amslaurea.unibo.it/11799/.

Full text
Abstract:
This work explores the attitude of organisations towards the business processes that sustain them: from the near-absence of structure, to the functional organisation, up to the advent of Business Process Reengineering and Business Process Management, which arose to overcome the limits and problems of the previous model. Within the BPM life cycle, the process mining methodology finds its place: it enables a level of process analysis starting from event data logs, that is, the event records referring to all activities supported by a corporate information system. Process mining can be seen as a natural bridge connecting process-based (but not data-driven) management disciplines and the new developments of business intelligence, which are capable of handling and manipulating the enormous amount of data available to companies (but are not process-driven). The thesis describes the requirements and technologies that enable the use of the discipline, as well as the three techniques it enables: process discovery, conformance checking and process enhancement. Process mining was used as the main tool in a consulting project by HSPI S.p.A. on behalf of an important Italian client, a provider of IT platforms and solutions. The project I took part in, described in this work, aims to support the organisation in its plan for improving internal performance, and it made it possible to verify the applicability and limits of process mining techniques. Finally, the appendix contains a paper I produced that collects the applications of the discipline in real business contexts, drawing data and information from working papers, business cases and direct channels. For its validity and completeness, this document has been published on the website of the IEEE Task Force on Process Mining.
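Process discovery, the first of the three techniques named above, can be illustrated in a few lines: from an event log of (case id, activity, timestamp) records one counts how often one activity directly follows another, which already yields the directly-follows graph that many discovery algorithms start from. The sketch below is a generic illustration and is not tied to the HSPI project data.

```python
from collections import Counter

def directly_follows(event_log):
    """Count directly-follows relations in an event log.

    event_log -- iterable of (case_id, activity, timestamp) tuples
    Returns a Counter mapping (activity_a, activity_b) -> frequency.
    """
    traces = {}
    for case_id, activity, ts in sorted(event_log, key=lambda e: (e[0], e[2])):
        traces.setdefault(case_id, []).append(activity)
    dfg = Counter()
    for trace in traces.values():
        dfg.update(zip(trace, trace[1:]))   # consecutive activity pairs
    return dfg

log = [("c1", "register", 1), ("c1", "review", 2), ("c1", "approve", 3),
       ("c2", "register", 1), ("c2", "review", 2), ("c2", "reject", 3)]
print(directly_follows(log))
# Counter({('register', 'review'): 2, ('review', 'approve'): 1, ('review', 'reject'): 1})
```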
APA, Harvard, Vancouver, ISO, and other styles
35

Mongardi, Michele. "Progettazione di un sistema DLT per l'interoperabilità e lo scambio di dati sanitari utilizzando le tecnologie IOTA e IPFS." Bachelor's thesis, Alma Mater Studiorum - Università di Bologna, 2020. http://amslaurea.unibo.it/20605/.

Full text
Abstract:
Distributed Ledger Technologies (DLTs) are a new type of technology in which data is saved in a distributed ledger managed by nodes in a peer-to-peer network through a consensus-based protocol. They offer great potential for making the exchange of healthcare information immutable and secure. The purpose of this thesis is to explore the potential of a specific DLT, used together with the IPFS technology, in supporting the transmission of health data, by developing a program for the automated, tamper-proof and protected exchange of information on patients' health. The pursued objective is to obtain accurate profiling of patients by periodically collecting information on individuals' state of health, so as to subsequently provide high-quality healthcare thanks to the effective and efficient exchange of the collected data.
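The overall flow described here, encrypting a health record, putting the bulk data on IPFS, and anchoring only a content identifier and hash on the ledger, can be sketched as below. The encryption uses the real `cryptography` library; the IPFS and IOTA calls are left as hypothetical placeholder functions because their client APIs are not specified in the abstract.

```python
import hashlib
import json
from cryptography.fernet import Fernet

def publish_ipfs(data: bytes) -> str:
    """Placeholder: add encrypted data to IPFS and return its CID."""
    raise NotImplementedError("wire this to an IPFS client")

def anchor_on_ledger(message: dict) -> None:
    """Placeholder: publish the message on the DLT (e.g. an IOTA transaction)."""
    raise NotImplementedError("wire this to a DLT client")

def share_health_record(record: dict, key: bytes) -> dict:
    """Encrypt a record, store it off-chain, and anchor its fingerprint."""
    ciphertext = Fernet(key).encrypt(json.dumps(record).encode())
    message = {
        "cid": publish_ipfs(ciphertext),                   # where the data lives
        "sha256": hashlib.sha256(ciphertext).hexdigest(),  # integrity check
    }
    anchor_on_ledger(message)
    return message

# Usage (once the placeholders are wired up):
# key = Fernet.generate_key()
# share_health_record({"patient": "p-001", "heart_rate": 72}, key)
```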
APA, Harvard, Vancouver, ISO, and other styles
36

Cavallin, Riccardo. "Approccio blockchain per la gestione dei dati personali." Bachelor's thesis, Alma Mater Studiorum - Università di Bologna, 2020. http://amslaurea.unibo.it/21604/.

Full text
Abstract:
This work presents blockchain technology in its functionalities and its limits. In particular, the Ethereum, RadixDLT and IOTA platforms are presented. The implications of the GDPR regulation for this technology are discussed with respect to the construction of a Data Marketplace based on Ethereum. After presenting the architecture of a marketplace, the performance of several online storage services for the archiving of personal data is analysed.
APA, Harvard, Vancouver, ISO, and other styles
37

Spiga, Beatrice. "Creazione di un sistema per la gestione di dati medicali con utilizzo di tecnologia blockchain." Bachelor's thesis, Alma Mater Studiorum - Università di Bologna, 2019.

Find full text
Abstract:
The objective of this thesis is the analysis, design and development of a web application for the management software of the medical examinations carried out in the laboratories of a company specialised in genomics and regenerative medicine. In addition, the thesis introduces the possible practical applications of DLTs (Distributed Ledger Technologies) to the medical data coming from the aforementioned web application, which was developed during the curricular internship. The value that the use of blockchain could have in the healthcare sector is then described. In particular, the use of the Masked Authenticated Messaging (MAM) protocol is discussed, in order to authenticate, encrypt and actively transmit medical data on an IOTA network after its retrieval from the web application.
APA, Harvard, Vancouver, ISO, and other styles
38

Meza, De los Cobos Benjamín, and Monroy Ricardo Ortigoza. "The effectiveness of MRP II to integrate enterprise systems : Effektiviteten av MRP II för att integrera företagssystem." Thesis, Växjö University, School of Technology and Design, 2006. http://urn.kb.se/resolve?urn=urn:nbn:se:vxu:diva-959.

Full text
Abstract:
Small and Medium-sized Enterprises are the backbone of the economy of many nations. SMEs, by the usual definition, make up 99.8% of the approximately 19 million enterprises of the European Union (ISO, 2002), and Sweden is not an exception. That is why SMEs face the strategic challenge of achieving sustained profitable growth. To meet this challenge, SMEs must develop capabilities to integrate their systems. Since ICT gives so many advantages to support the Supply Chain, and MRP II software has become a very popular tool in the last thirty years, our purpose is to answer the following research question:
How can MRP II-type computer systems be used effectively to support manufacturing and organizational integration?
The research started with on-site observations and interviews, but the development of a model and a survey was needed. Afterwards, we linked the research with an already accepted model. The results show how important the human aspect and accuracy are in the effective usage of an MRP II system. They also demonstrate that the MRP II philosophy must be accepted in order to use the MRP II software profitably. Nevertheless, we modelled the vicious cycle that our case company might deal with, tried to find its root cause and gave recommendations to break it.
APA, Harvard, Vancouver, ISO, and other styles
39

Zanni, Maria Angeliki. "Communication of sustainability information and assessment within BIM-enabled collaborative environment." Thesis, Loughborough University, 2017. https://dspace.lboro.ac.uk/2134/24680.

Full text
Abstract:
Sustainable performance of buildings has become a major concern among construction industry professionals. However, sustainability considerations are often treated as an add-on to building design, following ad hoc processes for their implementation. As a result, the most common problem in achieving a sustainable building outcome is the absence of the right information at the right time to make critical decisions. For design team members to appreciate the requirements of multidisciplinary collaboration, there is a need for transparency and a shared understanding of the process. The aim of this study is to investigate, model, and facilitate the early stages of Building Information Modelling (BIM) enabled Sustainable Building Design (SBD) by formalising the ad hoc working relationships of the best practices in order to standardise the optimal collaboration workflows. Thus, this research strives to improve the BIM maturity level for SBD, assisting in the transition from 'ad hoc' to 'defined', and then to 'managed'. For this purpose, this study has adopted an abductive research approach (an iterative process of induction and deduction) for theory building and testing. Four (4) stages of data collection have been conducted, which have resulted in a total of 32 semi-structured interviews with industry experts from 17 organisations. Fourteen (14) best practice case studies have been identified, and 20 incident narratives have been collected applying the Critical Decision Method (CDM) to examine roles and responsibilities, resources, information exchanges, interdependencies, timing and sequence of events, and critical decisions. As a result, the research has classified the critical components of SBD into a framework utilising content and thematic analyses. These have included the definition of roles and competencies that are essential for SBD along with the existing opportunities, challenges, and limitations. Then, Schedules of Services for SBD have been developed for the following stages of the RIBA Plan of Work 2013: stage 0 (Strategic Definition), stage 1 (Preparation and Brief), and stage 2 (Concept Design). The abovementioned SBD components have been coordinated explicitly into a systematic process, which follows Concurrent Engineering (CE) principles utilising Integrated DEFinition (IDEF) structured diagramming techniques (IDEF0 and IDEF3). The results have identified the key players' roles and responsibilities, tasks (BIM Uses), BIM-based deliverables, and critical decision points for SBD. Furthermore, the Green BIM Box (GBB) workflow management prototype tool has been developed to analyse communication and delivery of BIM-enabled SBD in a centralised system (Common Data Environment, CDE). GBB's system architecture for SBD process automation is demonstrated through Use Case Scenarios utilising the OMG UML (Object Management Group's Unified Modelling Language) notation. The proposed solution facilitates the implementation of BIM, Information Communication Technology (ICT), and Building Performance Analysis (BPA) software to realise the benefits of combining distributed teams' expertise holistically into a common process. Finally, the research outcomes have been validated through academic and industrial reviews that have led to the refinement of the IDEF process model and framework. It has been found that collaborative patterns are repeatable for a variety of different non-domestic building types such as education, healthcare, and offices.
Therefore, the research findings support the idea that a detailed process, which follows specified communication patterns, can assist in achieving sustainability targets efficiently in terms of time, cost, and effort.
APA, Harvard, Vancouver, ISO, and other styles
40

Krupa, Miroslav. "Metody technické prognostiky aplikovatelné v embedded systémech." Doctoral thesis, Vysoké učení technické v Brně. Fakulta elektrotechniky a komunikačních technologií, 2012. http://www.nusl.cz/ntk/nusl-233568.

Full text
Abstract:
The main goal of this dissertation is to provide a comprehensive overview of technical prognostics, which finds application in so-called predictive maintenance based on continuous monitoring of equipment and estimation of the system's degradation level or remaining useful life, especially for complex devices and machines. At present, technical diagnostics is relatively well mapped and deployed in practice, unlike technical prognostics, which is still a developing field that lacks a larger number of real applications; moreover, not all methods are sufficiently accurate and applicable to embedded systems. The dissertation gives an overview of the basic methods usable for remaining-useful-life prediction and describes metrics by which individual approaches can be compared, both in terms of accuracy and in terms of computational cost. One of the core contributions is a recommendation and procedure for selecting a suitable prognostic method with respect to prognostic criteria. Another contribution is the introduction of particle filtering, suitable for model-based prognostics, together with a verification of its implementation and a comparison. The main contribution is a case study on the very topical subject of Li-Ion battery prognostics with respect to continuous monitoring. The case study demonstrates the model-based prognostic process and compares possible approaches both for estimating the time before the battery is discharged and for observing possible influences on battery degradation. The work also includes a basic verification of the Li-Ion battery model and a design of the prognostic process.
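As a rough illustration of the particle-filtering approach to model-based prognostics discussed above, the sketch below estimates the fade rate of a hypothetical Li-Ion cell and extrapolates its remaining useful life; the exponential capacity-fade model, noise levels, and end-of-life threshold are illustrative assumptions, not values from the dissertation.

```python
import numpy as np

# Hypothetical capacity-fade model: C(k) = C0 * exp(-lam * k), with lam unknown.
C0, true_lam, eol_threshold = 1.0, 0.0005, 0.8      # end of life at 80% capacity
rng = np.random.default_rng(0)
measurements = C0 * np.exp(-true_lam * np.arange(200)) + rng.normal(0, 0.01, 200)

N = 1000                                    # number of particles
lam = rng.uniform(1e-4, 2e-3, N)            # particles over the unknown fade rate
weights = np.full(N, 1.0 / N)

for k, z in enumerate(measurements):
    lam += rng.normal(0, 1e-6, N)           # small random walk (process noise)
    pred = C0 * np.exp(-lam * k)            # predicted capacity per particle
    weights *= np.exp(-0.5 * ((z - pred) / 0.01) ** 2)   # Gaussian likelihood
    weights /= weights.sum()
    if 1.0 / np.sum(weights ** 2) < N / 2:  # resample when effective size drops
        idx = rng.choice(N, N, p=weights)
        lam, weights = lam[idx], np.full(N, 1.0 / N)

lam_hat = float(np.sum(weights * lam))      # posterior mean fade rate
rul = np.log(C0 / eol_threshold) / lam_hat - len(measurements)
print(f"estimated fade rate {lam_hat:.6f} per cycle, RUL about {rul:.0f} cycles")
```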
APA, Harvard, Vancouver, ISO, and other styles
41

Huang, Shih-Chien, and 黃詩茜. "IoT Device Management Platform with Data Visualization Design." Thesis, 2017. http://ndltd.ncl.edu.tw/handle/8664fs.

Full text
Abstract:
Master's thesis, National Dong Hwa University, Department of Computer Science and Information Engineering, academic year 106. With the vigorous development of the Internet of Things (IoT), related applications are increasingly combined with Web services, enabling a wide range of cloud application services through a unified data format and cloud computing. The IoT device management platform plays a vital role in this scenario: it supports a variety of applications and services on the Internet, such as storing sensed data in the cloud, and screening, sharing, and analyzing the data, as well as other value-added processing. The platform also supports the integration of multiple transmission protocols for devices in the physical layer, so sensed data arriving over different transmission protocols can be seamlessly aggregated and integrated on the platform. We design an online IoT platform built from multiple network services that provides visualization of sensed data, unification of heterogeneous protocols, anomaly analysis of data, integration with social networks, and more. The proposed platform consists of three components: device management, data visualization, and data analysis. This thesis focuses on the integration of all components, the message exchange process, and the data visualization modules on the Web. Finally, we show the overall operational flow of the three functions in the interface and the feedback results of each component on the web page.
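As a minimal sketch of how a sensing device might push readings into such a platform over one of the unified transport protocols, the snippet below publishes JSON readings via MQTT with the paho-mqtt helper; the broker address and topic scheme are assumptions, not the platform's actual interface.

```python
import json, time
import paho.mqtt.publish as publish

BROKER = "iot-platform.example.org"            # hypothetical platform broker
TOPIC = "lab/room101/temperature"              # hypothetical topic naming scheme

for i in range(5):
    reading = {"sensor_id": "temp-01", "value": 24.0 + 0.1 * i, "ts": time.time()}
    # One reading per publish; the platform side would parse the JSON,
    # store it, and feed the visualization and anomaly-analysis modules.
    publish.single(TOPIC, payload=json.dumps(reading), qos=1, hostname=BROKER)
    time.sleep(10)
```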
APA, Harvard, Vancouver, ISO, and other styles
42

Wang, Jiun-Cheng, and 王駿程. "Design and Implementation of IoT Vehicular Big-Data Management System." Thesis, 2017. http://ndltd.ncl.edu.tw/handle/w55b69.

Full text
Abstract:
Master's thesis, Yuan Ze University, Department of Electrical Engineering, academic year 105. IoT is currently a popular research topic. As the number of devices grows, the amount of data increases dramatically, so efficiently processing and analyzing such large amounts of data has become an important issue. In this study, we design a database system to manage and analyze vehicular rental data, which presents different aspects of the analytical results through visualizations, including hot-zone maps, time series, and so on. To avoid redundant calculations and reduce computation time, we cache the pre-processed data in the system. The comparison shows that processing time is vastly reduced.
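A minimal sketch of the caching idea described in the abstract: aggregate rental records once and reuse the result for repeated visualization queries. The SQLite backend and table layout are assumptions made for illustration, not the system's actual database.

```python
import sqlite3
from functools import lru_cache

conn = sqlite3.connect("rentals.db")   # hypothetical rental-records database

@lru_cache(maxsize=128)
def hourly_pickups(zone: str, day: str) -> dict:
    """Aggregate pickups per hour for one zone and day; cached after first call."""
    rows = conn.execute(
        "SELECT strftime('%H', pickup_time) AS hour, COUNT(*) "
        "FROM rentals WHERE zone = ? AND date(pickup_time) = ? "
        "GROUP BY hour ORDER BY hour",
        (zone, day),
    ).fetchall()
    return dict(rows)

# First call hits the database; repeated calls for the same hot-zone map
# or time-series view are served from the in-memory cache.
series = hourly_pickups("taipei-main-station", "2017-05-01")
```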
APA, Harvard, Vancouver, ISO, and other styles
43

Tu, Yi-Lin, and 杜易霖. "Dynamic Data Management Unit for IoT Routers with Heterogeneous Surrounding Sensors." Thesis, 2015. http://ndltd.ncl.edu.tw/handle/uh8wmg.

Full text
Abstract:
Master's thesis, National Chiao Tung University, Department of Electronics Engineering and Institute of Electronics, academic year 104. Given the large scale of today's IoT applications, the great number and high diversity of sensor data make it difficult for the whole system to collect and store these data, so a memory system with high bandwidth and large storage is necessary in IoT routers. Furthermore, the FIFO memory is the first stage of the IoT router to receive the large volume of sensor data, so it has a great influence on the performance of the whole router. As a result, a dynamic data management unit is proposed to serve as the FIFO memory and increase performance. Low power consumption is also considered in our design to meet the power budget of IoT routers. Our design is implemented with Synopsys Design Compiler in TSMC 90 nm technology at 50 MHz, and the power consumption is simulated using a power model. Under low, medium, and high injection loads, the latency of our design is 57.9%, 60.2%, and 44.1% lower than that of the distributed FIFO; its area is 3.2% larger and its average power is 22.3% lower than the distributed FIFO.
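The sketch below is a purely behavioral illustration (not the RTL design) of the difference between a shared, dynamically allocated buffer and a distributed FIFO with a fixed depth per port, which is where such latency gains under uneven injection loads come from; sizes and port counts are invented.

```python
from collections import deque

class DynamicBuffer:
    """Behavioral sketch: one shared pool dynamically divided among ports,
    in contrast to a distributed FIFO with a fixed depth per port."""

    def __init__(self, total_slots: int, ports: int):
        self.free = total_slots
        self.queues = [deque() for _ in range(ports)]

    def push(self, port: int, packet) -> bool:
        if self.free == 0:            # whole pool exhausted, not just one port
            return False              # back-pressure toward the sensor side
        self.queues[port].append(packet)
        self.free -= 1
        return True

    def pop(self, port: int):
        if self.queues[port]:
            self.free += 1
            return self.queues[port].popleft()
        return None

# A bursty port can temporarily claim most of the pool instead of overflowing
# its own fixed partition.
buf = DynamicBuffer(total_slots=16, ports=4)
buf.push(0, "sensor-packet-0")
```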
APA, Harvard, Vancouver, ISO, and other styles
44

Shih-HaoHuang and 黃詩豪. "A Fine-Grained GDPR-Compliant IoT Data Management Based on Revocable Blockchain Framework." Thesis, 2019. http://ndltd.ncl.edu.tw/handle/6vabwd.

Full text
APA, Harvard, Vancouver, ISO, and other styles
45

Iikubo, Azusa, and 飯窪梓. "The Design of Power Saving Control and Data Transmission Management for IoT Sensing Devices." Thesis, 2017. http://ndltd.ncl.edu.tw/handle/55n9j2.

Full text
Abstract:
Master's thesis, National Dong Hwa University, Department of Computer Science and Information Engineering, academic year 105. With the advance of the Internet of Things (IoT), IoT services and applications have been integrated into people's living spaces, allowing users to obtain information about their surrounding environment. Thanks to the progress of open hardware, users can build their own low-cost IoT sensing devices and deploy them in their living environment. However, these devices usually run on battery power, so it is necessary to reduce their energy consumption to extend their lifetime; and without efficient data transmission and management methods on these devices, it is hard for users to acquire environmental information. Therefore, in this thesis, we design and implement IoT sensing devices that support sleep scheduling and data compression to reduce energy consumption, and design a system that can effectively transmit the data and manage the devices. The designed components are integrated into an IoT management platform developed by our lab so that users can manage and control the devices on the web and immediately see the latest environmental information.
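A minimal sketch of the two energy-saving ideas mentioned above, duty-cycled sampling plus compressing a batch of readings before transmission; the sleep interval, batch size, and upload hook are illustrative assumptions, not the thesis's design parameters.

```python
import json, time, zlib

SLEEP_INTERVAL = 60          # seconds between wake-ups (assumed duty cycle)
BATCH_SIZE = 30              # readings accumulated before one radio transmission

def read_sensor() -> dict:
    # Placeholder for an actual sensor driver on the device.
    return {"temp": 24.7, "humidity": 61, "ts": time.time()}

def upload(payload: bytes) -> None:
    # Placeholder: hand the compressed batch to the radio / IoT platform.
    print(f"uploading {len(payload)} bytes")

batch = []
while True:                              # device main loop
    batch.append(read_sensor())
    if len(batch) >= BATCH_SIZE:
        raw = json.dumps(batch).encode()
        upload(zlib.compress(raw))       # fewer, smaller transmissions save energy
        batch.clear()
    time.sleep(SLEEP_INTERVAL)           # stand-in for putting the MCU to sleep
```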
APA, Harvard, Vancouver, ISO, and other styles
46

Navarro, Emerson de Moraes. "Proposal of architecture for IoT solution for monitoring and management of plantations." Master's thesis, 2021. http://hdl.handle.net/10400.8/5588.

Full text
Abstract:
The world population growth is increasing the demand for food production. Furthermore, the reduction of the workforce in rural areas and the increase in production costs are challenges for food production nowadays. Smart farming is a farm management concept that may use the Internet of Things (IoT) to overcome the current challenges of food production. This work presents a systematic review of the existing literature on smart farming with IoT. The systematic review reveals an evolution in the way data are processed by IoT solutions in recent years: traditional approaches mostly used data in a reactive manner, whereas recent approaches use data to prevent crop problems and to improve the accuracy of crop diagnosis. Based on the findings of the systematic review, this work proposes an architecture for an IoT solution that enables monitoring and management of crops in real time. The proposed architecture allows the use of big data and machine learning to process the collected data. A prototype is implemented to validate the operation of the proposed architecture, and a security risk assessment of the implemented prototype is carried out. The implemented prototype successfully validates the proposed architecture. The architecture presented in this work allows the implementation of IoT solutions in different farming scenarios, such as indoor and outdoor farming.
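One small piece of such an architecture, sketched under assumptions: a gateway-side routine that scores field readings with a machine-learning model to flag crop stress. The features, synthetic training data, and threshold rule are invented for illustration and are not taken from the dissertation.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

# Synthetic stand-in for historical field data: soil moisture (%), air temp (C),
# leaf wetness (0-1) -> crop-stress label. Real features would come from the
# deployed sensors and agronomic records.
rng = np.random.default_rng(1)
X = np.column_stack([rng.uniform(10, 60, 500),
                     rng.uniform(5, 40, 500),
                     rng.uniform(0, 1, 500)])
y = ((X[:, 0] < 20) & (X[:, 1] > 30)).astype(int)   # "stressed" when dry and hot

model = RandomForestClassifier(n_estimators=50, random_state=0).fit(X, y)

# At run time the gateway scores the latest readings and can raise an alert
# before the problem becomes visible in the field.
latest = np.array([[15.0, 33.5, 0.2]])
print("crop stress predicted" if model.predict(latest)[0] else "conditions normal")
```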
APA, Harvard, Vancouver, ISO, and other styles
47

Яєчник, Олександр Петрович, та Oleksandr Petrovych Yaiechnyk. "Дослідження протоколів взаємодії з IoT-пристроями при формуванні інформаційно-технологічних платформ". Master's thesis, 2020. http://elartu.tntu.edu.ua/handle/lib/34116.

Full text
Abstract:
The thesis is devoted to the study of interaction protocols for IoT devices and systems and to the formation of an information technology platform for managing IoT devices. The first section considers the relevance of research in the field of the Internet of Things, covers the problems of the Internet of Things, analyses the current state of research, and describes the IoT ecosystem. The second section examines data transfer protocols for IoT devices and systems: network-layer routing protocols, network-layer encapsulation protocols, session-layer protocols, and protocols for managing IoT devices and systems. The third section describes the structure of an information technology platform for the integration and management of IoT devices, designed as a nine-level model and formed on the basis of an analysis of scientific sources; it also covers protocols and standards for securing IoT systems and projects aimed at improving the security of IoT devices. The electrical safety of computer users' workplaces is considered separately, and the organisation of civil protection at industrial facilities is described.
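To make the session-layer protocols surveyed in such work concrete, the sketch below reads a sensor resource over CoAP with the aiocoap client; the device URI and resource path are placeholders, not taken from the thesis.

```python
import asyncio
from aiocoap import Context, Message, GET

async def read_temperature():
    # CoAP: a lightweight, UDP-based protocol commonly compared with MQTT
    # in surveys of IoT session-layer protocols.
    protocol = await Context.create_client_context()
    request = Message(code=GET, uri="coap://device.local/sensors/temperature")
    response = await protocol.request(request).response
    print(response.code, response.payload.decode())

asyncio.run(read_temperature())
```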
APA, Harvard, Vancouver, ISO, and other styles
48

Phengsuwan, J., T. Shah, R. Sun, P. James, Dhaval Thakker, and R. Ranjan. "An ontology-based system for discovering landslide-induced emergencies in electrical grid." 2019. http://hdl.handle.net/10454/17769.

Full text
Abstract:
Early warning systems (EWS) for electrical grid infrastructure have played a significant role in the efficient management of electricity supply in natural hazard prone areas. Modern EWS rely on scientific methods to analyze a variety of Earth Observation and ancillary data provided by multiple and heterogeneous data sources for the monitoring of electrical grid infrastructure. Furthermore, through cooperation, EWS for natural hazards contribute to monitoring by reporting hazard events that are associated with a particular electrical grid network. Additionally, sophisticated domain knowledge of natural hazards and the electrical grid is also required to enable dynamic and timely decision-making about the management of electrical grid infrastructure in serious hazards. In this paper, we propose a data integration and analytics system that enables an interaction between natural hazard EWS and electrical grid EWS to contribute to electrical grid network monitoring and support decision-making for electrical grid infrastructure management. We prototype the system using landslides as an example natural hazard for grid infrastructure monitoring. Essentially, the system consists of background knowledge about landslides as well as information about data sources to facilitate the process of data integration and analysis. Using the knowledge modeled, the prototype system can report the occurrence of landslides and suggest potential data sources for electrical grid network monitoring. Funding: FloodPrep, Grant/Award Number NE/P017134/1; LandSlip, Grant/Award Number NE/P000681/1.
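A sketch of how a knowledge graph of the kind described could be queried with rdflib to find grid assets exposed to a reported landslide; the namespace, classes, and properties are invented for illustration and do not come from the paper's ontology.

```python
from rdflib import Graph, Namespace, RDF

EX = Namespace("http://example.org/hazard#")    # hypothetical ontology namespace

g = Graph()
# Toy facts: one landslide event and one substation located in the affected zone.
g.add((EX.slide42, RDF.type, EX.LandslideEvent))
g.add((EX.slide42, EX.affectsZone, EX.zoneA))
g.add((EX.substation7, RDF.type, EX.GridAsset))
g.add((EX.substation7, EX.locatedInZone, EX.zoneA))

query = """
PREFIX ex: <http://example.org/hazard#>
SELECT ?asset ?event WHERE {
  ?event a ex:LandslideEvent ; ex:affectsZone ?zone .
  ?asset a ex:GridAsset ; ex:locatedInZone ?zone .
}"""

for asset, event in g.query(query):
    print(f"{asset} is potentially affected by {event}")
```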
APA, Harvard, Vancouver, ISO, and other styles
49

Soderi, Mirco. "Semantic models for the modeling and management of big data in a smart city environment." Doctoral thesis, 2021. http://hdl.handle.net/2158/1232245.

Full text
Abstract:
The overall purpose of this research has been the building or the improvement of semantic models for the representation of data related to smart cities and smart industries, in such a way that it is also possible to build context-rich, user-oriented, efficient and effective applications based on such data. In more detail, one of the key purposes has been the modelling of structural and functioning aspects of urban mobility and the production of instances exploiting Open Street Map, which, once integrated with traffic sensor data, has led to the building and displaying of real-time traffic reconstructions at city level. A second key purpose has been the modelling of the Internet of Things, which today allows sensing devices that are deployed in a given area or along a given path and that are of a given type to be seamlessly and efficiently identified, and the real-time data they produce to be inspected, through a user-oriented Web application, namely the Service Map. A pragmatic approach to the modelling has been followed, always taking into consideration the best practices of semantic modelling on one side, so that a clean, comprehensive and understandable model could result, and the reality of the data at our hands and of the applicative requirements on the other side. The identification of architectures and methods that could grant efficiency and scalability in data access has also been a primary purpose of this research, which has led to the definition and implementation of a federation of Service Maps, namely the Super Service Map. The architecture is fully distributed: each Super Service Map has a local list of the actual Service Maps with relevant metadata, exposes the same interface as the actual Service Maps, and forwards requests and builds merged responses, also implementing security and caching mechanisms. The identification of technologies, tools, and methods for presenting the data in a user-friendly manner has also been a relevant part of this research, and it has led, among other things, to the definition and implementation of a client-server architecture and a Web interface in the Snap4City platform for the building, management, and displaying of synoptic templates and instances, thanks to which users can securely display and interact with different types of data. Finally, some effort has been made towards the automatic classification of RDF datasets as to their structures and purposes, based on the computation of metrics through SPARQL queries and on the application of dimensionality reduction and clustering techniques. A Web portal is available where directories, datasets, metrics, and computations can be inspected even in real time.
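The final step mentioned, classifying RDF datasets from metrics computed over them, can be sketched with scikit-learn as below, assuming the per-dataset metrics have already been obtained via SPARQL; the metric values are made up for illustration.

```python
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA
from sklearn.cluster import KMeans

# Rows = RDF datasets, columns = structural metrics computed via SPARQL
# (e.g. number of triples, classes, properties, avg. out-degree) - values invented.
metrics = np.array([
    [1.2e6,  45,  210,  8.1],
    [9.8e5,  40,  190,  7.9],
    [3.1e4, 310, 1500,  2.3],
    [2.7e4, 290, 1420,  2.1],
    [5.5e5,  12,   60, 15.0],
])

X = StandardScaler().fit_transform(metrics)       # put metrics on a common scale
X2 = PCA(n_components=2).fit_transform(X)         # dimensionality reduction
labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(X2)
print(labels)   # datasets with similar structure/purpose fall in the same cluster
```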
APA, Harvard, Vancouver, ISO, and other styles
50

Banotra, Kanwal Dev. "Primary health care monitoring and evaluation system-an integrated data management model in Indian environment." Thesis, 1989. http://localhost:8080/iit/handle/2074/4576.

Full text
APA, Harvard, Vancouver, ISO, and other styles