
Journal articles on the topic 'Electronic spreadsheets. End-user computing'

Consult the top 50 journal articles for your research on the topic 'Electronic spreadsheets. End-user computing.'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever these are available in the metadata.

Browse journal articles from a wide variety of disciplines and organise your bibliography correctly.

1

Panko, Raymond R., and Daniel N. Port. "End User Computing." Journal of Organizational and End User Computing 25, no. 3 (2013): 1–19. http://dx.doi.org/10.4018/joeuc.2013070101.

Full text
Abstract:
End user computing (EUC) is like dark matter in physics. EUC is enormous in quantity and importance yet has been largely invisible to corporate IT departments, information systems (IS) researchers, and corporate management. EUC applications, especially spreadsheet applications, are also “dark” in the sense that they pose a number of overlooked risks for organizations, including errors, privacy violations, trade secret extrusions, and compliance violations. On the positive side, EUC applications are also like the dark energy of physics. They are supporting critical gains in decision making, computing by scientists and engineers, operational systems, and other important processes in every corner of the firm. It is time to stop ignoring end user computing in general and spreadsheets in particular. The purpose of this paper is to document, to the extent possible today, the importance of end user computing relative to the concerns of corporate IT departments and IS researchers.
APA, Harvard, Vancouver, ISO, and other styles
2

Janvrin, Diane J. "Detecting Spreadsheet Errors: An Education Case." Issues in Accounting Education 23, no. 3 (2008): 435–54. http://dx.doi.org/10.2308/iace.2008.23.3.435.

Full text
Abstract:
In the past two decades, computing applications developed by non-system professionals have increased dramatically throughout the professional accounting and business community. Unfortunately, prior research indicates that many end-user computing applications such as spreadsheets contain difficult-to-detect errors. Relying on these error-prone applications, professionals may make inappropriate decisions that can negatively impact financial results. This case enhances students' understanding of end-user computing risk by illustrating the difficulty of detecting errors in spreadsheets. Furthermore, this case helps students improve their ability to detect spreadsheet errors.
APA, Harvard, Vancouver, ISO, and other styles
3

Obrenović, Željko, and Dragan Gašević. "End-User Service Computing: Spreadsheets as a Service Composition Tool." IEEE Transactions on Services Computing 1, no. 4 (2008): 229–42. http://dx.doi.org/10.1109/tsc.2008.16.

Full text
APA, Harvard, Vancouver, ISO, and other styles
4

RAMAKRISHNAN, C. R., I. V. RAMAKRISHNAN, and DAVID S. WARREN. "XcelLog: a deductive spreadsheet system." Knowledge Engineering Review 22, no. 3 (2007): 269–79. http://dx.doi.org/10.1017/s026988890700118x.

Full text
Abstract:
The promise of rule-based computing was to allow end-users to create, modify, and maintain applications without the need to engage programmers. But experience has shown that rule sets often interact in subtle ways, making them difficult to understand and reason about. This has impeded the widespread adoption of rule-based computing. This paper describes the design and implementation of XcelLog, a user-centered deductive spreadsheet system, to empower non-programmers to specify and manipulate rule-based systems. The driving idea underlying the system is to treat sets as the fundamental data type and rules as specifying relationships among sets, and use the spreadsheet metaphor to create and view the materialized sets. The fundamental feature that makes XcelLog suitable for non-programmers is that the user mainly sees the effect of the rules; when rules or basic facts change, the user sees the impact of the change immediately. This enables the user to gain confidence in the rules and their modification, and also experiment with what-if scenarios without any programming. Preliminary experience with using XcelLog indicates that it is indeed feasible to put the power of deductive spreadsheets for doing rule-based computing into the hands of end-users and do so without the requirement of programming or the constraints of canned application packages.
APA, Harvard, Vancouver, ISO, and other styles
5

Fachini, Ramon Faganello, Kleber Francisco Esposto, and Victor Claudio Bento Camargo. "A framework for development of advanced planning and scheduling (APS) systems in glass container industry." Journal of Manufacturing Technology Management 29, no. 3 (2018): 570–87. http://dx.doi.org/10.1108/jmtm-06-2017-0126.

Full text
Abstract:
Purpose: The purpose of this paper is to present a new framework for designing and implementing simple but effective advanced planning and scheduling (APS) systems in the glass container industry by means of spreadsheets.
Design/methodology/approach: The conceptual framework for APS system design is developed by integrating principles of end-user computing and mixed-integer programming (MIP). This framework comprises the APS implementation and its integration with enterprise resource planning systems. The proposal is applied in a real-world glass bottle manufacturer, and its performance assessment is carried out using metrics of the well-known supply chain operations reference (SCOR) model.
Findings: The case study shows that applying the framework can improve the responsiveness of the glass container supply chain (SC) as well as significantly reduce SC planning costs. A further assessment of end-user computing satisfaction highlights the general acceptance of the framework.
Practical implications: The proposed framework can assist glass industry managers and practitioners in solving their complex production planning problem, since custom solutions are not available for this sector. Another important implication is the possible generalization of the framework, given that MIP models for production planning are currently available in the literature for different industries.
Originality/value: On the one hand, the new framework successfully integrates concepts of end-user computing and MIP, bridging the gap between the theory of the aforementioned areas and their practical applications. On the other hand, the paper contributes to the scarce literature on performance evaluation of APS systems and their impact on production system performance. The novel APS gain analysis based on SCOR metrics ensures that the system's potential benefits are aligned with modern concepts of supply chain management.
APA, Harvard, Vancouver, ISO, and other styles
6

Sisson, Roger L. "Managing microcomputers and end-user computing: Some critical issues." Telematics and Informatics 2, no. 2 (1985): 133–40. http://dx.doi.org/10.1016/s0736-5853(85)80006-x.

Full text
APA, Harvard, Vancouver, ISO, and other styles
7

YOON, C. Y. "An Evaluation System for End-User Computing Capability in a Computing Business Environment." IEICE Transactions on Information and Systems E91-D, no. 11 (2008): 2607–15. http://dx.doi.org/10.1093/ietisy/e91-d.11.2607.

Full text
APA, Harvard, Vancouver, ISO, and other styles
8

Nedbaylov, A. A. "INFORMATION STRUCTURING FOR SOLVING TASKS IN SPREADSHEET ENVIRONMENT." Informatics and education, no. 2 (April 3, 2019): 42–46. http://dx.doi.org/10.32517/0234-0453-2019-34-2-42-46.

Full text
Abstract:
The calculations required in project activities for engineering students are commonly performed in electronic spreadsheets. Practice has shown that such calculations could prove quite difficult for students of other fields. One of the causes of this situation (as well as, in part, of problems observed during Java and C programming language courses) lies in the lack of a streamlined distribution structure for both the source data and the end results. A solution could be found in a shared approach to information structuring in spreadsheet and software environments, called “the Book Method”, which takes into account engineering psychology issues regarding the user-friendliness of working with electronic information. This method can be applied at different levels in academic institutions and in teacher training courses.
APA, Harvard, Vancouver, ISO, and other styles
9

Dayal, Aveen, Naveen Paluru, Linga Reddy Cenkeramaddi, Soumya J., and Phaneendra K. Yalavarthy. "Design and Implementation of Deep Learning Based Contactless Authentication System Using Hand Gestures." Electronics 10, no. 2 (2021): 182. http://dx.doi.org/10.3390/electronics10020182.

Full text
Abstract:
Hand-gesture-based sign language digits have several contactless applications. Applications include communication for impaired people, such as elderly and disabled people, health-care applications, automotive user interfaces, and security and surveillance. This work presents the design and implementation of a complete end-to-end deep-learning-based edge computing system that can verify a user contactlessly using an ‘authentication code’. The ‘authentication code’ is an ‘n’-digit numeric code whose digits are hand gestures of sign language digits. We propose a memory-efficient deep learning model to classify the hand gestures of the sign language digits. The proposed deep learning model is based on the bottleneck module, which is inspired by deep residual networks. The model achieves a classification accuracy of 99.1% on the publicly available sign language digits dataset. The model is deployed on a Raspberry Pi 4 Model B edge computing system to serve as an edge device for user verification. The edge computing system operates in two steps: it first takes input from the attached camera in real time and stores it in a buffer; in the second step, the model classifies the digit, with an inference time of 280 ms, taking the first image in the buffer as input.
APA, Harvard, Vancouver, ISO, and other styles
10

Alfiansyah, Gamasiano, Andar Sifa’il Fajeri, Maya Weka Santi, and Selvia Juwita Swari. "Evaluasi Kepuasan Pengguna Electronic Health Record (EHR) Menggunakan Metode EUCS (End User Computing Satisfaction) di Unit Rekam Medis Pusat RSUPN Dr. Cipto Mangunkusumo." Jurnal Penelitian Kesehatan "SUARA FORIKES" (Journal of Health Research "Forikes Voice") 11, no. 3 (2020): 258. http://dx.doi.org/10.33846/sf11307.

Full text
Abstract:
RSUPN Dr. Cipto Mangunkusumo is one of the hospitals whose services use an Electronic Health Record (EHR). In practice, the EHR frequently stalls and produces errors during service, and several menus are still lacking. The purpose of this research was to evaluate user satisfaction with reporting in the Electronic Health Record (EHR) in the central medical records unit of RSUPN Dr. Cipto Mangunkusumo. This research was quantitative descriptive, with a population of all Electronic Health Record users in the central medical records unit and a sample of 50 respondents. Sampling was conducted by systematic random sampling. Data were analyzed through scoring and presented in table form. The results showed that the accuracy dimension scored 73.28%, format 71.6%, ease of use 69.2%, content 69.2%, and timeliness 65.66%. These dimension scores indicate good criteria, i.e. users are satisfied with the current Electronic Health Record (EHR); however, the information system still requires development, adding and adjusting the modules contained in the EHR so that user satisfaction continues to increase.
Keywords: evaluation; electronic health record (EHR); end user computing satisfaction (EUCS)
 
APA, Harvard, Vancouver, ISO, and other styles
11

Morkevicius, Nerijus, Algimantas Venčkauskas, Nerijus Šatkauskas, and Jevgenijus Toldinas. "Method for Dynamic Service Orchestration in Fog Computing." Electronics 10, no. 15 (2021): 1796. http://dx.doi.org/10.3390/electronics10151796.

Full text
Abstract:
Fog computing is meant to deal with the problems which cloud computing cannot solve alone. As the fog is closer to the user, it can improve some very important QoS characteristics, such as latency and availability. One of the challenges in the fog architecture is heterogeneous constrained devices and the dynamic nature of the end devices, which require dynamic service orchestration to provide efficient service placement inside the fog nodes. An optimization method is needed to ensure the required level of QoS while requiring minimal resources from fog and end devices, thus ensuring the longest lifecycle of the whole IoT system. A two-stage multi-objective optimization method to find the best placement of services among available fog nodes is presented in this paper. A Pareto set of non-dominated possible service distributions is found using an integer multi-objective particle swarm optimization method. Then, the analytical hierarchy process is used to choose the best service distribution according to an application-specific judgment matrix. An illustrative scenario with experimental results is presented to demonstrate the characteristics of the proposed method.
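The Pareto-filtering step this abstract describes (keeping only non-dominated service placements before a final selection) can be sketched as follows; this is a minimal illustration, not the paper's algorithm, and the candidate tuples and objective names are invented for the example:

```python
def pareto_front(solutions):
    """Return the non-dominated subset of candidate service placements.

    Each solution is a tuple of objective values to be minimized,
    e.g. (latency, resource_use). A solution is dominated if some other
    solution is no worse in every objective and differs from it.
    """
    front = []
    for s in solutions:
        dominated = any(
            all(o <= so for o, so in zip(other, s)) and other != s
            for other in solutions
        )
        if not dominated:
            front.append(s)
    return front

# Hypothetical (latency, resource_use) pairs for four placements.
candidates = [(10, 5), (8, 7), (12, 4), (9, 9)]
print(pareto_front(candidates))  # → [(10, 5), (8, 7), (12, 4)]
```

Here (9, 9) is dominated by (8, 7), so it is dropped; a second stage (in the paper, the analytical hierarchy process) would then pick one placement from the remaining front.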
APA, Harvard, Vancouver, ISO, and other styles
12

Ngowi, Lucas, Ellen Kalinga, and Nerey Mvungi. "Socio-Technical Perspective for Electronic Tax Information System in Tanzania." Tanzania Journal of Engineering and Technology 40, no. 1 (2021): 62–78. http://dx.doi.org/10.52339/tjet.v40i1.714.

Full text
Abstract:
Socio-technical systems theory has rarely been used by system architects in setting up computing systems. However, the role of socio-technical concepts in computing, which is becoming social in nature, has made these concepts more relevant and commercial. Tax information systems are examples of such systems because they are influenced by external variables such as the political environment, technological trends, and the social environment, introducing complexity in their deployment and determining the type of e-services and their delivery to a diverse group of people. It was observed that in Tanzania there is resistance, reluctance, and minimal use of the electronic tax system because of insufficient end-user support and end-user involvement in constructing the system. Therefore, there is a need to develop an electronic tax information system from a socio-technical systems perspective to ensure the design of an efficient, user-friendly tax administration system. The research used a qualitative approach, featuring case studies in Korea, Chile, Tanzania, and Denmark. The study used best practices from the Organization for Economic Cooperation and Development (OECD) to benchmark the Tanzania Revenue Authority's current practices. It was found that the tax models implemented are techno-centric push models, which do not attract use by taxpayers and require human intervention in their operation, and hence are not cost-effective. As the first and relevant phase in socio-technical system development, this paper presents the problem definition and analysis of the e-Tax collection system in Tanzania.
APA, Harvard, Vancouver, ISO, and other styles
13

SAUTER, VICKI L., and LAURENCE A. MADEO. "AN EXPLORATORY ANALYSIS OF THE NEED FOR USER-ACQUAINTED DIAGNOSTIC SUPPORT SYSTEMS." International Journal of Information Technology & Decision Making 03, no. 03 (2004): 471–91. http://dx.doi.org/10.1142/s0219622004001148.

Full text
Abstract:
This paper explores intelligent diagnostic support systems as debugging tools for end-users of computing. By analyzing error (fault) behavior of users and fault-diagnostic relationships of these errors, the authors identified patterns that could be exploited to provide electronic diagnostic assistance. This analysis showed that (a) error behavior differs considerably across end-users; and (b) individual end-users tend to make the same errors over time because they have difficulty identifying the causes of their errors. When viewed in light of the literatures on human-computer interface design and human error/diagnostic behavior, this analysis led to some general conclusions about how diagnostic systems could be designed to provide better advice. Specifically, the empirical results suggest that diagnostic systems with firing rules based solely upon the aggregated behavior of all users will often provide individual users with poor advice. In contrast, diagnostic support systems could be improved by using user-specific data in the knowledge base. Such a deviation from conventional ideas about knowledge-base development seems consistent with other diagnostic situations, such as medical and machine diagnosis.
APA, Harvard, Vancouver, ISO, and other styles
14

Cha, Hyun-Jong, Ho-Kyung Yang, and You-Jin Song. "A Study on the Design of Fog Computing Architecture Using Sensor Networks." Sensors 18, no. 11 (2018): 3633. http://dx.doi.org/10.3390/s18113633.

Full text
Abstract:
It is expected that the number of devices connecting to the Internet-of-Things (IoT) will increase geometrically in the future, with improvement of their functions. Such devices may create a huge amount of data to be processed in a limited time. Under the IoT environment, data management should play the role of an intermediate level between the objects and devices that generate data and the applications that access the data for analysis and the provision of services. IoT interactively connects all communication devices and allows global access to the data generated by a device. Fog computing manages data and computation at the edge of the network, near the end user, and provides new types of applications and services, with low latency, high bandwidth, and geographical distribution. In this paper, we propose a fog computing architecture for efficiently and reliably delivering IoT data to the corresponding IoT applications while ensuring time sensitivity. Based on fog computing, the proposed architecture provides efficient power management in IoT device communication between sensors and secure management of data to be decrypted based on user attributes. The functional effectiveness and the safe data management of the proposed method are evaluated through experiments.
APA, Harvard, Vancouver, ISO, and other styles
15

Zhu, Yong, Chao Huang, Zhihui Hu, Abdullah Al-Dhelaan, and Mohammed Al-Dhelaan. "Blockchain-Enabled Access Management System for Edge Computing." Electronics 10, no. 9 (2021): 1000. http://dx.doi.org/10.3390/electronics10091000.

Full text
Abstract:
In the post-cloud era, edge computing is a new computing paradigm with data processed at the edge of the network, which can process data close to the end-user in real time and intelligently offload cloud tasks. Meanwhile, the decentralization, tamper-proofing, and anonymity of blockchain technology can provide a new trusted computing environment for edge computing. However, it does raise considerable concerns about security, privacy, fault tolerance, and so on. For example, identity authentication and access control rely on third parties, heterogeneous devices, and different vendors in IoT, leading to security and privacy risks, etc. How to combine the advantages of the two has become a highlight of academic research, especially the issue of secure resource management. Comprehensive security and privacy involve all aspects of platform, data, application, and access control. In this paper, the architecture and behavior of an Access Management System (AMS) in a proof of concept (PoC) prototype are proposed with a Colored Petri Net (CPN) model. The two domains of blockchain and edge computing are organically connected by interfaces and interactions. The simulation of operation, activity, and role association proves the feasibility and effectiveness of the AMS. Instances of platform business access control, data access control, database services, and IoT hub services are run on Advantech WISE-PaaS through User Account and Authentication (UAA). Finally, fine-grained and distributed access control can be realized with the help of a blockchain attribute. Namely, smart contracts are used to register, broadcast, and revoke access authorization, as well as to create specific transactions to define access control policies.
APA, Harvard, Vancouver, ISO, and other styles
16

C. da Silva, Rodrigo A., and Nelson L. S. da Fonseca. "On the Location of Fog Nodes in Fog-Cloud Infrastructures." Sensors 19, no. 11 (2019): 2445. http://dx.doi.org/10.3390/s19112445.

Full text
Abstract:
In the fog computing paradigm, fog nodes are placed on the network edge to meet end-user demands with low latency, providing the possibility of new applications. Although the role of the cloud remains unchanged, a new network infrastructure for fog nodes must be created. The design of such an infrastructure must consider user mobility, which causes variations in workload demand over time in different regions. Properly deciding on the location of fog nodes is important to reduce the costs associated with their deployment and maintenance. To meet these demands, this paper discusses the problem of locating fog nodes and proposes a solution which considers time-varying demands, with two classes of workload in terms of latency. The solution was modeled as a mixed-integer linear programming formulation with multiple criteria. An evaluation with real data showed that an improvement in end-user service can be obtained in conjunction with the minimization of the costs by deploying fewer servers in the infrastructure. Furthermore, results show that costs can be further reduced if a limited blocking of requests is tolerated.
APA, Harvard, Vancouver, ISO, and other styles
17

Purwandani, Indah. "Analisa Tingkat Kepuasan Pengguna Elearning Menggunakan EUCS dan Model Delone and McLean." Indonesian Journal on Software Engineering (IJSE) 4, no. 2 (2019): 99–106. http://dx.doi.org/10.31294/ijse.v4i2.5989.

Full text
Abstract:
E-learning is now a necessity to support the implementation of learning in various educational institutions. End User Computing Satisfaction (EUCS) is a method used to measure the level of satisfaction of users of an application system by comparing expectations and the reality of an information system. In this study, five EUCS indicators were measured to determine the level of e-learning user satisfaction, namely content, accuracy, format, ease of use, and timeliness. In the DeLone and McLean model there are six indicator factors, namely system quality, information quality, service quality, use, user satisfaction, and net benefits. Using these two methods, the level of e-learning user satisfaction at Bina Sarana Informatika University is measured. This study uses random samples collected through electronic questionnaires. There are several limitations in this study, and further studies are needed so that the level of e-learning user satisfaction can be described correctly.
Keywords: e-learning, EUCS, end user computing satisfaction, DeLone and McLean
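EUCS studies such as this one typically report each dimension as the achieved percentage of the maximum attainable Likert score. A minimal sketch of that scoring, with invented 5-point responses (the dimension names follow the EUCS instrument, but the numbers are illustrative assumptions, not data from the paper):

```python
# Hypothetical 5-point Likert responses grouped by EUCS dimension.
responses = {
    "content":     [4, 3, 4, 5],
    "accuracy":    [4, 4, 3, 4],
    "format":      [3, 4, 4, 3],
    "ease_of_use": [3, 3, 4, 4],
    "timeliness":  [3, 3, 3, 4],
}

def eucs_scores(responses, max_point=5):
    """Score each dimension as the percentage of the maximum score."""
    return {
        dim: round(100 * sum(vals) / (max_point * len(vals)), 2)
        for dim, vals in responses.items()
    }

print(eucs_scores(responses))
```

With these made-up responses, content scores 80.0% and timeliness 65.0%; a study would then map such percentages onto satisfaction criteria (e.g. "good"/"satisfied").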
APA, Harvard, Vancouver, ISO, and other styles
18

Zhang, Jiawei, Ning Lu, Teng Li, and Jianfeng Ma. "Enabling Efficient Decentralized and Privacy Preserving Data Sharing in Mobile Cloud Computing." Wireless Communications and Mobile Computing 2021 (September 2, 2021): 1–15. http://dx.doi.org/10.1155/2021/8513869.

Full text
Abstract:
Mobile cloud computing (MCC) is embracing rapid development these days and is able to provide data outsourcing and sharing services for cloud users with pervasively smart mobile devices. Although these services bring various conveniences, many security concerns, such as illegal access and user privacy leakage, are inflicted. Aiming to protect the security of cloud data sharing against unauthorized access, many studies have been conducted on fine-grained access control using ciphertext-policy attribute-based encryption (CP-ABE). However, a practical and secure data sharing scheme that simultaneously supports fine-grained access control, a large universe, key escrow freeness, and privacy protection in MCC, with an expressive access policy, high efficiency, verifiability, and exculpability on resource-limited mobile devices, has not been fully explored yet. Therefore, we investigate the challenge and propose an Efficient and Multiauthority Large Universe Policy-Hiding Data Sharing (EMA-LUPHDS) scheme. In this scheme, we employ a fully hidden policy to preserve user privacy in the access policy. To adapt to the large-scale and distributed MCC environment, we optimize multiauthority CP-ABE to be compatible with a large attribute universe. Meanwhile, for efficiency purposes, online/offline and verifiable outsourced decryption techniques with exculpability are leveraged in our scheme. In the end, we demonstrate the flexibility and high efficiency of our proposal for data sharing in MCC by extensive performance evaluation.
APA, Harvard, Vancouver, ISO, and other styles
19

Kirstein, T., G. Tröster, and P. Lukowicz. "Wearable Systems for Health Care Applications." Methods of Information in Medicine 43, no. 03 (2004): 232–38. http://dx.doi.org/10.1055/s-0038-1633863.

Full text
Abstract:
Objectives: Wearable systems can be broadly defined as mobile electronic devices that can be unobtrusively embedded in the user's outfit as part of the clothing or an accessory. In particular, unlike conventional mobile systems, they can be operational and accessed with little or no hindrance to user activity. To this end they are able to model and recognize user activity, state, and the surrounding situation: a property referred to as context sensitivity. Wearable systems range from micro sensors seamlessly integrated in textiles, through consumer electronics embedded in fashionable clothes and computerized watches, to belt-worn PCs with a head-mounted display. The wearable computing concept is part of a broader framework of ubiquitous computing that aims at invisibly enhancing our environment with smart electronic devices. The goal of the paper is to provide a broad overview of wearable technology and its implications for health-related applications.
Methods: We begin by summarizing the vision behind wearable computing. We then describe a framework for wearable computing architecture and the main technological aspects. Finally, we show how specific properties of wearable systems can be used in different health-related application domains.
Results: Wearable computing is an emerging concept building upon the success of today's mobile computing and communication devices. Due to rapid technological progress, it is currently making a transition from a pure research stage to practical applications. Many of those applications are in health-related domains, in particular health monitoring, mobile treatment, and nursing.
Conclusions: Within the next couple of years, wearable systems and, more generally, ubiquitous computing will introduce profound changes and new application types to health-related systems. In particular, they will prove useful in improving the quality and reducing the cost of caring for the aging population.
APA, Harvard, Vancouver, ISO, and other styles
20

Hasson, Mushtaq, Ali A. Yassin, Abdulla J. Yassin, Abdullah Mohammed Rashid, Aqeel A. Yaseen, and Hamid Alasadi. "Password authentication scheme based on smart card and QR code." Indonesian Journal of Electrical Engineering and Computer Science 23, no. 1 (2021): 140. http://dx.doi.org/10.11591/ijeecs.v23.i1.pp140-149.

Full text
Abstract:
As a promising computing paradigm, cloud services are available to end users on a pay-as-you-go basis. Security represents one of the vital issues for the extended adoption of cloud computing, with the objective of accessing several cloud service providers, applications, and services while using anonymity features to authenticate the user. We present an authentication scheme based on a quick response (QR) code and a smart card. Furthermore, our proposed scheme has several crucial merits, such as key management, mutual authentication, one-time passwords, user anonymity, freely chosen passwords, secure password changes, and revocation by using the QR code. The security of the proposed scheme depends on a crypto-hash function, QR-code validation, and the smart card. Moreover, our proposed scheme can resist numerous malicious attacks and is more appropriate for practical applications than previous works. The proposed scheme has been proved to provide strong mutual authentication based on Burrows-Abadi-Needham (BAN) logic and security analysis. Furthermore, our proposed scheme shows good results compared with related work.
APA, Harvard, Vancouver, ISO, and other styles
21

Zabala, Alaitz, Raffaele Vitulli, and Xavier Pons. "Impact of CCSDS-IDC and JPEG 2000 Compression on Image Quality and Classification." Journal of Electrical and Computer Engineering 2012 (2012): 1–13. http://dx.doi.org/10.1155/2012/761067.

Full text
Abstract:
This study measures the impact of both on-board and user-side lossy image compression (CCSDS-IDC and JPEG 2000) on image quality and classification. The Sentinel-2 Image Performance Simulator was modified to include these compression algorithms in order to produce Sentinel-2 simulated images with on-board lossy compression. A multitemporal set of Landsat images was used for the user-side compression scenario in order to study a crop area. The performance of several compressors was evaluated by computing the Signal-to-Noise Ratio (SNR) of the compressed images. The overall accuracy of land-cover classifications of these images was also evaluated. The results show that on-board CCSDS performs better than JPEG 2000 in terms of compression fidelity, especially at lower compression ratios (from CR 2:1 up to CR 4:1, i.e., 8 to 4 bpppb). The effect of compression on land cover classification follows the same trends, but compression fidelity may not be enough to assess the impact of compression on end-user applications. If compression is applied by end-users, the results show that 3D-JPEG 2000 obtains higher compression fidelity than CCSDS and JPEG 2000 with other parameterizations. This is due to the high dynamic range of the images (representing reflectances*10000), which JPEG 2000 is able to exploit better.
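The compression-fidelity metric used in this abstract, the Signal-to-Noise Ratio of a compressed image against the original, can be sketched as follows; the flattened pixel lists and their values are illustrative assumptions, not data from the study:

```python
import math

def snr_db(original, compressed):
    """Signal-to-noise ratio in dB: signal power over error (noise) power."""
    signal = sum(x * x for x in original)
    noise = sum((x - y) ** 2 for x, y in zip(original, compressed))
    return math.inf if noise == 0 else 10 * math.log10(signal / noise)

# Hypothetical flattened pixel values before and after lossy compression.
orig = [100.0, 102.0, 98.0, 101.0]
comp = [100.0, 101.0, 99.0, 101.0]
print(round(snr_db(orig, comp), 1))  # → 43.0
```

A higher SNR means the compressed image stays closer to the original; the study compares such fidelity figures across compressors and compression ratios.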
APA, Harvard, Vancouver, ISO, and other styles
22

Li, Cheng, Li Yang, and Jianfeng Ma. "A Secure and Verifiable Outsourcing Scheme for Assisting Mobile Device Training Machine Learning Model." Wireless Communications and Mobile Computing 2020 (November 17, 2020): 1–16. http://dx.doi.org/10.1155/2020/8825623.

Full text
Abstract:
In smart applications such as smart medical equipment, more data needs to be processed and trained locally or near the local end to prevent privacy leaks. However, the storage and computing capabilities of smart devices are limited, so some computing tasks need to be outsourced; concurrently, malicious nodes must be prevented from accessing user data during outsourced computing. Therefore, this paper proposes EVPP (efficient, verifiable, and privacy-preserving), a computation outsourcing scheme used in the training process of machine learning models, in which edge nodes outsource the complex computing process to an edge service node. First, we conducted testing to identify the parts that need to be outsourced; in this solution, the computationally intensive part of the model training process is outsourced. Meanwhile, a random encryption perturbation is applied to the outsourced training matrix, and verification factors are introduced to ensure the verifiability of the results. In addition, the system can generate verifiable evidence to build a trust mechanism when a malicious service node is found. The paper also discusses applying the scheme to other algorithms to broaden its applicability. Analysis of theoretical and experimental data shows that the proposed scheme can effectively use the computing power of the equipment.
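The combination of a random perturbation on the outsourced matrix plus a verification step can be illustrated with a generic sketch (not the authors' exact EVPP construction; the function names and the additive mask are illustrative): the client masks the matrix before outsourcing a multiplication, removes the mask afterwards, and verifies the result with a Freivalds-style probabilistic test that is far cheaper than recomputing the product.

```python
import random

def mat_mul(A, B):
    n, m, p = len(A), len(B), len(B[0])
    return [[sum(A[i][k] * B[k][j] for k in range(m)) for j in range(p)] for i in range(n)]

def mask(A, R):
    """Additive random perturbation: A' = A + R, hiding A from the server."""
    return [[a + r for a, r in zip(row, rrow)] for row, rrow in zip(A, R)]

def freivalds_check(A, B, C, rounds=10):
    """Probabilistic verification that C == A*B without recomputing the product."""
    n = len(C)
    for _ in range(rounds):
        v = [[random.randint(0, 1)] for _ in range(n)]
        # A*(B*v) should equal C*v for the true product
        if mat_mul(A, mat_mul(B, v)) != mat_mul(C, v):
            return False
    return True

A = [[1, 2], [3, 4]]
B = [[5, 6], [7, 8]]
R = [[2, -1], [0, 3]]                    # client-side random perturbation

# server computes on the masked matrix; client subtracts R*B afterwards
C_masked = mat_mul(mask(A, R), B)
correction = mat_mul(R, B)
C = [[cm - co for cm, co in zip(r1, r2)] for r1, r2 in zip(C_masked, correction)]

assert freivalds_check(A, B, C)          # honest server passes verification
assert C == mat_mul(A, B)
```

Since (A + R)B = AB + RB, subtracting the locally computed RB recovers the true product, while the server only ever sees the masked matrix.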
APA, Harvard, Vancouver, ISO, and other styles
23

Liu, Zhen, Jiawei Zhang, Yanan Li, and Yuefeng Ji. "Hierarchical MEC Servers Deployment and User-MEC Server Association in C-RANs over WDM Ring Networks." Sensors 20, no. 5 (2020): 1282. http://dx.doi.org/10.3390/s20051282.

Full text
Abstract:
With the increasing number of Internet of Things (IoT) devices, a huge amount of latency-sensitive and computation-intensive IoT applications have been injected into the network. Deploying mobile edge computing (MEC) servers in a cloud radio access network (C-RAN) is a promising candidate that brings a number of critical IoT applications to the edge network to reduce the heavy traffic load and the end-to-end latency. The MEC server deployment mechanism is highly related to user allocation. Therefore, in this paper, we study the hierarchical deployment of MEC servers and the user allocation problem. We first formulate the problem as a mixed integer nonlinear programming (MINLP) model to minimize the deployment cost and average latency. Based on the MINLP model, we then propose an enumeration algorithm and an approximate algorithm built on improved entropy weighting and the TOPSIS method. Numerical results show that the proposed algorithms reduce the total cost, and the approximate algorithm has a lower total cost compared with the heaviest-location-first and latency-based algorithms.
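The entropy-weight and TOPSIS ranking that the approximate algorithm relies on can be sketched as follows (the decision matrix, criteria, and candidate placements are illustrative assumptions, not the paper's data):

```python
import numpy as np

def entropy_weights(X):
    """Entropy weighting: criteria whose values vary more get larger weights."""
    P = X / X.sum(axis=0)
    n = X.shape[0]
    E = -(np.where(P > 0, P * np.log(P), 0.0)).sum(axis=0) / np.log(n)
    d = 1.0 - E
    return d / d.sum()

def topsis(X, weights, benefit):
    """Score alternatives by relative closeness to the ideal solution (TOPSIS)."""
    V = X / np.sqrt((X ** 2).sum(axis=0)) * weights      # weighted normalized matrix
    ideal = np.where(benefit, V.max(axis=0), V.min(axis=0))
    anti = np.where(benefit, V.min(axis=0), V.max(axis=0))
    d_pos = np.sqrt(((V - ideal) ** 2).sum(axis=1))
    d_neg = np.sqrt(((V - anti) ** 2).sum(axis=1))
    return d_neg / (d_pos + d_neg)                       # higher = better

# toy decision matrix: rows = candidate MEC placements,
# columns = (deployment cost [cost], average latency [cost], served users [benefit])
X = np.array([[100.0, 20.0, 500.0],
              [120.0, 10.0, 450.0],
              [ 90.0, 30.0, 300.0]])
benefit = np.array([False, False, True])
w = entropy_weights(X)
scores = topsis(X, w, benefit)
print(np.argsort(-scores))  # placement indices from best to worst
```

The closeness score always lies in [0, 1], so candidate deployments can be ranked directly.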
APA, Harvard, Vancouver, ISO, and other styles
24

Mitsis, Giorgos, Eirini Eleni Tsiropoulou, and Symeon Papavassiliou. "Data Offloading in UAV-Assisted Multi-Access Edge Computing Systems: A Resource-Based Pricing and User Risk-Awareness Approach." Sensors 20, no. 8 (2020): 2434. http://dx.doi.org/10.3390/s20082434.

Full text
Abstract:
Unmanned Aerial Vehicle (UAV)-assisted Multi-access Edge Computing (MEC) systems have emerged recently as a flexible and dynamic computing environment, providing task offloading services to users. For such a paradigm to be viable, the operator of a UAV-mounted MEC server should enjoy some form of profit by offering its computing capabilities to the end users. To deal with this issue, in this paper we apply a usage-based pricing policy for the exploitation of the servers’ computing resources. The proposed pricing mechanism implicitly introduces a more social behavior in the users with respect to competing for the UAV-mounted MEC servers’ computation resources. In order to properly model the users’ risk-aware behavior within the overall data offloading decision-making process, the principles of Prospect Theory are adopted, while the exploitation of the available computation resources is considered based on the theory of the Tragedy of the Commons. Initially, the user’s prospect-theoretic utility function is formulated by quantifying the user’s risk-seeking and loss-aversion behavior, while taking into account the pricing mechanism. Accordingly, the users’ pricing- and risk-aware data offloading problem is formulated as a distributed maximization problem of each user’s expected prospect-theoretic utility function and addressed as a non-cooperative game among the users. The existence of a Pure Nash Equilibrium (PNE) for the formulated non-cooperative game is shown based on the theory of submodular games. An iterative and distributed algorithm is introduced which converges to the PNE, following the learning rule of the best response dynamics. The performance evaluation of the proposed approach is achieved via modeling and simulation, and detailed numerical results are presented highlighting its key operational features and benefits.
APA, Harvard, Vancouver, ISO, and other styles
25

Mamidisetti, Gowtham, and Ramesh Makala. "A Proposed Model for Trust Management: Insights from a Simulation Study in the Context of Cloud Computing." Journal of Computational and Theoretical Nanoscience 17, no. 7 (2020): 2983–88. http://dx.doi.org/10.1166/jctn.2020.9121.

Full text
Abstract:
In computing systems, one of the central topics is cloud computing. This dominance is attributed to the crucial role that the concept plays in the daily lives of individuals, especially in the wake of the increasing adoption of technology by individuals and organizations. Indeed, the motivation behind the establishment, adoption, and implementation of cloud computing has been attributed to the need to offer low-cost and quick consumer service provision, as well as data manipulation and storage. However, the cloud environment continues to face security threats, a trend that calls for further investigations and analyses that could provide room for new system improvements. The current simulation study presents a dynamic model for security management in a cloud computing environment, with the central parameter being electronic trust. Importantly, the proposed study examines interactions between the data provider and the data owner, as well as the end user. Specifically, the proposed model ensures that, for authentication purposes and access permissions, trust values are continuously updated. The results show that the model is flexible in providing dynamic access control, a positive trend that points to its promising level of efficiency.
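The abstract does not specify the trust-update rule, so the following is only an illustrative sketch of a continuously updated trust value driving dynamic access control (the exponential-smoothing form, the learning rate, and the threshold are assumptions, not the paper's model):

```python
def update_trust(current_trust, interaction_score, alpha=0.3):
    """Exponentially weighted update: recent behaviour shifts trust gradually.
    current_trust and interaction_score are in [0, 1]; alpha is the learning rate."""
    return (1 - alpha) * current_trust + alpha * interaction_score

def access_granted(trust, threshold=0.6):
    """Dynamic access control: permission follows the continuously updated value."""
    return trust >= threshold

trust = 0.5
for score in [1.0, 1.0, 0.0, 1.0]:   # observed interaction outcomes over time
    trust = update_trust(trust, score)
print(round(trust, 3), access_granted(trust))
```

A single bad interaction lowers the value but does not immediately revoke access; sustained misbehaviour does, which is the "continuous update" property the model emphasizes.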
APA, Harvard, Vancouver, ISO, and other styles
26

Qin, Zhenquan, Zanping Cheng, Chuan Lin, Zhaoyi Lu, and Lei Wang. "Optimal Workload Allocation for Edge Computing Network Using Application Prediction." Wireless Communications and Mobile Computing 2021 (March 25, 2021): 1–13. http://dx.doi.org/10.1155/2021/5520455.

Full text
Abstract:
By deploying edge servers on the network edge, a mobile edge computing network strengthens real-time processing ability near the end devices and relieves the huge load pressure on the core network. Considering the limited computing and storage resources on the edge server side, the workload allocation among edge servers for each Internet of Things (IoT) application affects the response time of the application’s requests. Hence, when the access devices of the edge server are densely deployed, workload allocation becomes a key factor affecting the quality of user experience (QoE). To solve this problem, this paper proposes an edge workload allocation scheme that uses an application prediction (AP) algorithm to minimize response delay; the problem is proven to be NP-hard. First, in the application prediction model, a long short-term memory (LSTM) method is proposed to predict the tasks of future access devices. Second, based on the prediction results, the edge workload allocation is divided into two subproblems, a task assignment subproblem and a resource allocation subproblem, which can be solved in linear time using historical execution data. Simulation results show that the proposed AP algorithm can effectively reduce the response delay of the device and the average completion time of the task sequence, and approaches the theoretically optimal allocation results.
APA, Harvard, Vancouver, ISO, and other styles
27

Xu, Rongxu, Wenquan Jin, and Dohyeun Kim. "Microservice Security Agent Based On API Gateway in Edge Computing." Sensors 19, no. 22 (2019): 4905. http://dx.doi.org/10.3390/s19224905.

Full text
Abstract:
Internet of Things (IoT) devices are embedded with software, electronics, and sensors, and feature connectivity with constrained resources. They require the edge computing paradigm, with modular characteristics relying on microservices, to provide an extensible and lightweight computing framework at the edge of the network. Edge computing can relieve the burden of centralized cloud computing by performing certain operations, such as data storage and task computation, at the edge of the network. Despite its benefits, edge computing raises many challenges in terms of security and privacy. Thus, services that protect privacy and secure data are essential functions in edge computing. For example, the end user’s ownership of, and control over, private information are separated, which can easily lead to data leakage, unauthorized data manipulation, and other data security concerns. The confidentiality and integrity of the data therefore cannot be guaranteed, so more secure authentication and access mechanisms are required to ensure that the microservices are exposed only to authorized users. In this paper, we propose a microservice security agent that integrates the edge computing platform with API gateway technology to present a secure authentication mechanism. The aim of this platform is to afford edge computing clients a practical application which provides user authentication and allows JSON Web Token (JWT)-based secure access to the services of edge computing. To integrate the edge computing platform with the API gateway, we implement a microservice security agent based on the open-source Kong in the EdgeX Foundry framework. To provide an easy-to-use approach with Kong, we also implement REST APIs for generating new consumers, registering services, and configuring access controls. Finally, the usability of the proposed approach is demonstrated by evaluating the round-trip time (RTT). The results demonstrate the efficiency of the system and its suitability for real-world applications.
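The JWT-based access described above can be illustrated with a stdlib-only sketch of HS256 signing and verification (this mimics what a gateway such as Kong does internally; it is not Kong's API, and the secret and claims are made up):

```python
import base64, hashlib, hmac, json

def b64url(data: bytes) -> str:
    return base64.urlsafe_b64encode(data).rstrip(b"=").decode()

def make_jwt(payload: dict, secret: bytes) -> str:
    """HS256-signed JWT: header.payload.signature, each base64url-encoded."""
    header = b64url(json.dumps({"alg": "HS256", "typ": "JWT"}).encode())
    body = b64url(json.dumps(payload).encode())
    signing_input = f"{header}.{body}".encode()
    sig = b64url(hmac.new(secret, signing_input, hashlib.sha256).digest())
    return f"{header}.{body}.{sig}"

def verify_jwt(token: str, secret: bytes) -> bool:
    """The gateway recomputes the signature and compares in constant time."""
    header, body, sig = token.split(".")
    expected = b64url(hmac.new(secret, f"{header}.{body}".encode(), hashlib.sha256).digest())
    return hmac.compare_digest(sig, expected)

secret = b"shared-gateway-secret"
token = make_jwt({"sub": "edge-client-1", "scope": "device:read"}, secret)
assert verify_jwt(token, secret)            # registered consumer is admitted
assert not verify_jwt(token, b"wrong-key")  # forged or foreign tokens are rejected
```

Only requests carrying a token that verifies against the consumer's secret reach the microservice, which is the access-control property the agent enforces at the gateway.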
APA, Harvard, Vancouver, ISO, and other styles
28

Castillo-Cara, Manuel, Edgar Huaranga-Junco, Milner Quispe-Montesinos, Luis Orozco-Barbosa, and Enrique Arias Antúnez. "FROG: A Robust and Green Wireless Sensor Node for Fog Computing Platforms." Journal of Sensors 2018 (2018): 1–12. http://dx.doi.org/10.1155/2018/3406858.

Full text
Abstract:
Over the past few years, we have witnessed the widespread deployment of wireless sensor networks and distributed data management facilities: two main building blocks of the Internet of things (IoT) technology. Due to the spectacular increase on the demand for novel information services, the IoT-based infrastructures are more and more characterized by their geographical sparsity and increasing demands giving rise to the need of moving from a cloud to a fog model: a novel deployment paradigm characterized by the provisioning of elastic resources geographically located as close as possible to the end user. Despite the large number of wireless sensor networks already available in the market, there are still many issues to be addressed on the design and deployment of robust network platforms capable of meeting the demand and quality of fog-based systems. In this paper, we undertake the design and development of a wireless sensor node for fog computing platforms addressing two of the main issues towards the development and deployment of robust communication services, namely, energy consumption and network resilience provisioning. Our design is guided by examining the relevant macroarchitecture features and operational constraints to be faced by the network platform. We based our solution on the integration of network hardware platforms already available on the market supplemented by smart power management and network resilience mechanisms.
APA, Harvard, Vancouver, ISO, and other styles
29

Shameer, A. P., and A. C. Subhajini. "Quality of Service Aware Resource Allocation Using Hybrid Opposition-Based Learning-Artificial Bee Colony Algorithm." Journal of Computational and Theoretical Nanoscience 16, no. 2 (2019): 588–94. http://dx.doi.org/10.1166/jctn.2019.7775.

Full text
Abstract:
Cloud computing is currently one of the most challenging and fast-moving topics and is increasingly used in the corporate IT industry, where traditional infrastructure is not sufficient to handle big data. The recent trend is to move all types of data from desktops to huge data centers for computing. In this model, services are provided to the end user on a pay-as-you-use basis, with dynamic resource allocation guaranteeing services to the user. Proper resource allocation is one of the major issues in cloud computing. Resource management is the process of sharing out existing resources and helps to coordinate IT resources and applications over the Internet; the objective of successful resource allocation is to minimize costs for providers while achieving client satisfaction. In this paper, a hybrid opposition-based learning (OBL) and artificial bee colony (ABC) algorithm is used to solve resource allocation (RA) problems in cloud workflows; RA is usually a troublesome optimization problem. In our work we consider QoS requirements and issues: QoS parameters such as execution time, response time, throughput, and cost are taken into the algorithm, which tries to allocate resources efficiently. The algorithms were executed up to 50 times with different numbers of VMs and tasks. The proposed algorithm was compared with the existing PSO, GA, and ABC algorithms, and the CloudSim simulation output shows that the OABC algorithm outperforms the existing algorithms, achieving QoS and customer satisfaction while ensuring optimal resource use.
APA, Harvard, Vancouver, ISO, and other styles
30

Li, Haifeng, Caihui Lan, Xingbing Fu, Caifen Wang, Fagen Li, and He Guo. "A Secure and Lightweight Fine-Grained Data Sharing Scheme for Mobile Cloud Computing." Sensors 20, no. 17 (2020): 4720. http://dx.doi.org/10.3390/s20174720.

Full text
Abstract:
With the explosion of various mobile devices and the tremendous advancement in cloud computing technology, mobile devices have been seamlessly integrated with the premium powerful cloud computing known as an innovation paradigm named Mobile Cloud Computing (MCC) to facilitate the mobile users in storing, computing and sharing their data with others. Meanwhile, Attribute Based Encryption (ABE) has been envisioned as one of the most promising cryptographic primitives for providing secure and flexible fine-grained “one to many” access control, particularly in large scale distributed system with unknown participators. However, most existing ABE schemes are not suitable for MCC because they involve expensive pairing operations which pose a formidable challenge for resource-constrained mobile devices, thus greatly delaying the widespread popularity of MCC. To this end, in this paper, we propose a secure and lightweight fine-grained data sharing scheme (SLFG-DSS) for a mobile cloud computing scenario to outsource the majority of time-consuming operations from the resource-constrained mobile devices to the resource-rich cloud servers. Different from the current schemes, our novel scheme can enjoy the following promising merits simultaneously: (1) Supporting verifiable outsourced decryption, i.e., the mobile user can ensure the validity of the transformed ciphertext returned from the cloud server; (2) resisting decryption key exposure, i.e., our proposed scheme can outsource decryption for intensive computing tasks during the decryption phase without revealing the user’s data or decryption key; (3) achieving a CCA security level; thus, our novel scheme can be applied to the scenarios with higher security level requirement. The concrete security proof and performance analysis illustrate that our novel scheme is proven secure and suitable for the mobile cloud computing environment.
APA, Harvard, Vancouver, ISO, and other styles
31

Avgeris, Marios, Dimitrios Spatharakis, Dimitrios Dechouniotis, Nikos Kalatzis, Ioanna Roussaki, and Symeon Papavassiliou. "Where There Is Fire There Is SMOKE: A Scalable Edge Computing Framework for Early Fire Detection." Sensors 19, no. 3 (2019): 639. http://dx.doi.org/10.3390/s19030639.

Full text
Abstract:
A Cyber-Physical Social System (CPSS) tightly integrates computer systems with the physical world and human activities. In this article, a three-level CPSS for early fire detection is presented to assist public authorities to promptly identify and act on emergency situations. At the bottom level, the system’s architecture involves IoT nodes enabled with sensing and forest monitoring capabilities. Additionally, in this level, the crowd sensing paradigm is exploited to aggregate environmental information collected by end user devices present in the area of interest. Since the IoT nodes suffer from limited computational and energy resources, an Edge Computing Infrastructure, at the middle level, facilitates the offloaded data processing regarding possible fire incidents. At the top level, a decision-making service deployed on Cloud nodes integrates data from various sources, including users’ information on social media, and evaluates the situation criticality. In our work, a dynamic resource scaling mechanism for the Edge Computing Infrastructure is designed to address the demanding Quality of Service (QoS) requirements of this IoT-enabled time and mission critical application. The experimental results indicate that the vertical and horizontal scaling on the Edge Computing layer is beneficial for both the performance and the energy consumption of the IoT nodes.
APA, Harvard, Vancouver, ISO, and other styles
32

Erawantini, Feby, and Nugroho Setyo Wibowo. "Implementasi Rekam Medis Elektronik dengan Sistem Pendukung Keputusan Klinis." Jurnal Teknologi Informasi dan Terapan 6, no. 2 (2019): 75–78. http://dx.doi.org/10.25047/jtit.v6i2.115.

Full text
Abstract:
Abstract— The community has the right to receive quality and affordable health services. One effort to realize quality health services is the electronic medical record (RME). The RME application is expected to be integrated across health services and to function as a clinical decision support system. The purpose of this study is to implement the RME application in an educational clinic and evaluate user satisfaction with the implemented application. The method used was parallel implementation, with user satisfaction evaluated using the End User Computing Satisfaction (EUCS) method. Implementation of the RME application was carried out for two months in the educational clinic, from September to October 2019. The study involved 2 doctors, 1 registration officer, 2 nurses, and 1 pharmacy officer. The results show that the RME application can be implemented in the educational clinic, with clinical decision support in the form of blood pressure checks, Diabetes Mellitus risk, stroke risk, interactions between drug contents and allergy history, as well as warnings for abnormal laboratory and radiological examination results.
APA, Harvard, Vancouver, ISO, and other styles
33

Babu, L. R. Aravind, and J. Saravana Kumar. "A Novel Improved Grey Wolf Optimization Algorithm Based Resource Management Strategy for Big Data Systems." Journal of Computational and Theoretical Nanoscience 18, no. 4 (2021): 1227–32. http://dx.doi.org/10.1166/jctn.2021.9383.

Full text
Abstract:
Presently, big data is very popular, since it is helpful in diverse domains such as social media, e-commerce transactions, etc. Cloud computing offers on-demand services, broad network access, resource pooling, rapid elasticity, and measured service. Cloud resources are usually heterogeneous, and the application requirements of the end user change rapidly from time to time, so resource management is a tedious process. Resource management and scheduling play a vital part in cloud computing (CC) solutions, particularly when the environment is employed in the analysis of big data and minimally predictable workloads dynamically enter the cloud. Identifying optimal scheduling solutions with diverse variables in a varying platform remains a crucial problem; under a cloud platform, scheduling techniques should be able to adapt quickly to changes and to the input workload. In this paper, an improved grey wolf optimization (IGWO) algorithm with an oppositional learning principle is employed to carry out the scheduling task in an effective way. The presented IGWO-based scheduling algorithm achieves optimal cloud resource usage and offers an effective solution over the compared methods in a significant way.
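Grey wolf optimization with opposition-based initialization can be sketched generically as follows (an illustration of the idea, not the paper's exact IGWO; the objective function merely stands in for a scheduling cost):

```python
import random

def igwo_minimize(f, dim, bounds, wolves=10, iters=100, seed=1):
    """Grey wolf optimization: each wolf moves toward the three best wolves
    (alpha, beta, delta); opposition-based learning improves initialization."""
    rng = random.Random(seed)
    lo, hi = bounds

    # opposition-based learning: keep the better of each point and its opposite
    pop = []
    for _ in range(wolves):
        x = [rng.uniform(lo, hi) for _ in range(dim)]
        x_opp = [lo + hi - xi for xi in x]
        pop.append(min((x, x_opp), key=f))

    for t in range(iters):
        pop.sort(key=f)
        alpha, beta, delta = pop[0], pop[1], pop[2]
        a = 2.0 * (1 - t / iters)          # encircling coefficient decays to 0
        new_pop = []
        for x in pop:
            pos = []
            for j in range(dim):
                cand = []
                for leader in (alpha, beta, delta):
                    A = a * (2 * rng.random() - 1)
                    C = 2 * rng.random()
                    d = abs(C * leader[j] - x[j])
                    cand.append(leader[j] - A * d)
                pos.append(max(lo, min(hi, sum(cand) / 3)))  # clamp to bounds
            new_pop.append(pos)
        pop = new_pop
    return min(pop, key=f)

# toy scheduling-cost surrogate: squared distance from an ideal allocation
best = igwo_minimize(lambda x: sum(xi ** 2 for xi in x), dim=3, bounds=(-5, 5))
print([round(v, 3) for v in best])  # close to the optimum at the origin
```

In a scheduling setting, each wolf position would encode a candidate task-to-VM allocation and `f` its makespan or cost; the decaying coefficient shifts the swarm from exploration to exploitation.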
APA, Harvard, Vancouver, ISO, and other styles
34

Ruan, Jinjia, and Dongliang Xie. "Networked VR: State of the Art, Solutions, and Challenges." Electronics 10, no. 2 (2021): 166. http://dx.doi.org/10.3390/electronics10020166.

Full text
Abstract:
The networking of virtual reality applications will play an important role in the emerging global Internet of Things (IoT) framework, and it is expected to provide the foundation of the 5G tactile Internet ecosystem. However, considerable challenges are ahead in terms of technological constraints and infrastructure costs. The raw data rate (5 Gbps–60 Gbps) required to achieve an online immersive experience that is indistinguishable from real life vastly exceeds the capabilities of future broadband networks. Therefore, simply providing high bandwidth is insufficient in compensating for this difference, because the demands for scale and supply vary widely. This requires exploring holistic solutions that exceed the traditional network domain, integrating virtual reality (VR) data capture, encoding, network, and user navigation. Emerging services are extremely inefficient in terms of mass use and data management, which significantly reduces the user experience, due to their heuristic design choices. Other key aspects must be considered, such as wireless operation, ultra-low latency, client/network access, system deployment, edge computing/caching, and end-to-end reliability. A vast number of high-quality works have been published in this area and they will be highlighted in this survey. In addition to a thorough summary of recent progress, we also present an outlook of future developments in the quality of immersive experience networks and unified data set measurement in VR video transmission, focusing on the expansion of VR applications, security issues, and business issues, which have not yet been addressed, and the technical challenges that have not yet been completely solved. We hope that this paper will help researchers and developers to gain a better understanding of the state of research and development in VR.
APA, Harvard, Vancouver, ISO, and other styles
35

Rabbani, Imran Mujaddid, Muhammad Aslam, Ana Maria Martinez Enriquez, and Zeeshan Qudeer. "Service Association Factor (SAF) for Cloud Service Selection and Recommendation." Information Technology And Control 49, no. 1 (2020): 113–26. http://dx.doi.org/10.5755/j01.itc.49.1.23251.

Full text
Abstract:
Cloud computing is one of the leading technologies in the IT and computer science domain, and business IT infrastructures are increasingly equipping themselves with clouds. In the presence of several options, the selection decision becomes vital when there is no supporting information available. Global clouds also need evaluation and assessment from their users: what they think about services, and how new users could make their selection according to their needs. Recommender systems have been built to propose the best services using customer feedback, applying quality-of-service parameters, assigning scores, trustworthiness, and clustering in different forms and models. These techniques did not record and use the interrelationships between services, which reflect the true impact of service utilization. In the proposed approach, a service association factor calculates the value of interrelations among services used by the end user. An intelligent learning-based recommendation system is developed to assist users in selecting services according to their respective preferences. The technique is evaluated on leading service providers, and results show that the learning-based system performs well on all types of cloud models.
APA, Harvard, Vancouver, ISO, and other styles
36

MA, Taylor, Bennett CL, Schoen MW, and Hoque S. "Advances in artificial neural networks as a disease prediction tool." Journal of Cancer Research & Therapy 9, no. 1 (2021): 1–11. http://dx.doi.org/10.14312/2052-4994.2021-1.

Full text
Abstract:
Throughout the last decade, utilization of machine learning has seen a sharp rise in fields such as computing, transportation, engineering, and medicine. Artificial neural networks (ANNs) have demonstrated increased application due to their versatility and ability to learn from large datasets. The emergence of electronic health records has propelled healthcare into an era of personalized medicine largely aided by computers. This review summarizes the current state of ANNs as a predictive tool in medicine and the downfalls of reliance on a self-adjusting computer network to make healthcare decisions. Medical ANN studies can be grouped into three categories: diagnosis, classification, and prediction, with diagnostic studies currently dominating the field. However, recent trends show prediction studies may soon outnumber the remaining categories. ANN prediction studies dominate in fields such as cardiovascular disease, neurologic disease, and osteoporosis. Neural networks consistently show higher predictive accuracy than industry standards, but several pitfalls are preventing mainstream adoption. Clinicians often rely on situational pearls to make complex healthcare decisions; ANNs often do not account for intuitive variables during their analysis. Instead, ANNs rely on incomplete patient data and ‘black box’ computing to make decisions that are not completely transparent to the end-user. This has led to ‘runaway’ networks that may ultimately make inaccurate and harmful decisions. This review emphasizes the extensive potential of machine learning in medicine and the obstacles that must be overcome to utilize its full potential.
APA, Harvard, Vancouver, ISO, and other styles
37

Real and Araujo. "Navigation Systems for the Blind and Visually Impaired: Past Work, Challenges, and Open Problems." Sensors 19, no. 15 (2019): 3404. http://dx.doi.org/10.3390/s19153404.

Full text
Abstract:
Over the last decades, the development of navigation devices capable of guiding the blind through indoor and/or outdoor scenarios has remained a challenge. In this context, this paper’s objective is to provide an updated, holistic view of this research, in order to enable developers to exploit the different aspects of its multidisciplinary nature. To that end, previous solutions will be briefly described and analyzed from a historical perspective, from the first “Electronic Travel Aids” and early research on sensory substitution or indoor/outdoor positioning, to recent systems based on artificial vision. Thereafter, user-centered design fundamentals are addressed, including the main points of criticism of previous approaches. Finally, several technological achievements are highlighted as they could underpin future feasible designs. In line with this, smartphones and wearables with built-in cameras will then be indicated as potentially feasible options with which to support state-of-art computer vision solutions, thus allowing for both the positioning and monitoring of the user’s surrounding area. These functionalities could then be further boosted by means of remote resources, leading to cloud computing schemas or even remote sensing via urban infrastructure.
APA, Harvard, Vancouver, ISO, and other styles
38

Singh, Hardeep, and Dean Sittig. "A Socio-technical Approach to Preventing, Mitigating, and Recovering from Ransomware Attacks." Applied Clinical Informatics 07, no. 02 (2016): 624–32. http://dx.doi.org/10.4338/aci-2016-04-soa-0064.

Full text
Abstract:
Summary: Recently there have been several high-profile ransomware attacks involving hospitals around the world. Ransomware is intended to damage or disable a user’s computer unless the user makes a payment. Once the attack has been launched, users have three options: 1) try to restore their data from backup; 2) pay the ransom; or 3) lose their data. In this manuscript, we discuss a socio-technical approach to address ransomware and outline four overarching steps that organizations can undertake to secure an electronic health record (EHR) system and the underlying computing infrastructure. First, health IT professionals need to ensure adequate system protection by correctly installing and configuring computers and networks that connect them. Next, the health care organizations need to ensure more reliable system defense by implementing user-focused strategies, including simulation and training on correct and complete use of computers and network applications. Concomitantly, the organization needs to monitor computer and application use continuously in an effort to detect suspicious activities and identify and address security problems before they cause harm. Finally, organizations need to respond adequately to and recover quickly from ransomware attacks and take actions to prevent them in future. We also elaborate on recommendations from other authoritative sources, including the National Institute of Standards and Technology (NIST). Similar to approaches to address other complex socio-technical health IT challenges, the responsibility of preventing, mitigating, and recovering from these attacks is shared between health IT professionals and end-users. Citation: Sittig DF, Singh H. A socio-technical approach to preventing, mitigating, and recovering from ransomware attacks.
APA, Harvard, Vancouver, ISO, and other styles
39

B D, Deebak, Fadi Al-Turjman, and Leonardo Mostarda. "A Hash-Based RFID Authentication Mechanism for Context-Aware Management in IoT-Based Multimedia Systems." Sensors 19, no. 18 (2019): 3821. http://dx.doi.org/10.3390/s19183821.

Full text
Abstract:
With the technological advances in the areas of Machine-to-Machine (M2M) and Device-to-Device (D2D) communication, various smart computing devices now integrate a set of multimedia sensors such as accelerometers, barometers, cameras, fingerprint sensors, gesture sensors and iris scanners to infer the environmental status. These devices are generally identified using radio-frequency identification (RFID) to transfer the collected data to other local or remote objects over a geographical location. To enable automatic data collection and transmission, a valid RFID-embedded object is highly recommended; it is used to authorize the devices at various communication phases. In smart application devices, RFID-based authentication is enabled to provide short-range operation. Moreover, unlike bar-code systems, it does not require the communicating device to be in line-of-sight to gain server access. However, in existing authentication schemes, an adversary may capture private user data to create a forgery problem; another issue is the high computation cost. Thus, several studies have addressed the usage of context-aware authentication schemes for multimedia device management systems. The security objective is to determine the user's authenticity in order to withstand eavesdropping and tracing. Lately, RFID has played a significant role in context-aware sensor management systems (CASMS), as it can reduce the complexity of sensor systems and is applicable to access control, sensor monitoring, real-time inventory and security-aware management systems. For CASMS, the challenging issues are tag anonymity, mutual authentication and untraceability. Thus, this paper proposes a secure hash-based RFID mechanism for CASMS. The proposed protocol is based on the hash operation with a synchronized secret session key to withstand attacks such as desynchronization, replay and man-in-the-middle. Importantly, the security and performance analysis proves that the proposed hash-based protocol achieves better security and performance efficiency than other related schemes. From the simulation results, it is observed that the proposed scheme is secure, robust and less expensive while achieving better communication metrics such as packet delivery ratio, end-to-end delay and throughput rate.
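The abstract does not reproduce the protocol itself, but the core idea of a hash-based challenge-response with a synchronized session key can be sketched as follows (all class names, message formats and the one-step resynchronization rule are illustrative assumptions, not the authors' protocol):

```python
import hashlib
import secrets

def h(*parts: bytes) -> bytes:
    """One-way hash over concatenated message parts."""
    return hashlib.sha256(b"|".join(parts)).digest()

class Tag:
    def __init__(self, key: bytes):
        self.key = key

    def respond(self, challenge: bytes) -> bytes:
        # Prove knowledge of the session key without revealing it.
        resp = h(self.key, challenge)
        self.key = h(self.key)  # advance the synchronized session key
        return resp

class Server:
    def __init__(self, key: bytes):
        self.key = key

    def authenticate(self, tag: Tag) -> bool:
        challenge = secrets.token_bytes(16)  # fresh nonce blocks replay
        resp = tag.respond(challenge)
        # Also try the next key in the chain, so a lost response
        # (tag one step ahead) does not desynchronize the pair.
        for candidate in (self.key, h(self.key)):
            if secrets.compare_digest(resp, h(candidate, challenge)):
                self.key = h(candidate)  # resynchronize and advance
                return True
        return False
```

Because the server also accepts the next key in the hash chain, a single lost message leaves the pair recoverable rather than desynchronized.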
APA, Harvard, Vancouver, ISO, and other styles
40

Borejko, Tomasz, Krzysztof Marcinek, Krzysztof Siwiec, et al. "NaviSoC: High-Accuracy Low-Power GNSS SoC with an Integrated Application Processor." Sensors 20, no. 4 (2020): 1069. http://dx.doi.org/10.3390/s20041069.

Full text
Abstract:
A dual-frequency all-in-one Global Navigation Satellite System (GNSS) receiver with a multi-core 32-bit RISC (reduced instruction set computing) application processor was integrated and manufactured as a System-on-Chip (SoC) in a 110 nm CMOS (complementary metal-oxide semiconductor) process. The GNSS RF (radio frequency) front-end with baseband navigation engine is able to receive, simultaneously, Galileo (European Global Satellite Navigation System) E1/E5ab, GPS (US Global Positioning System) L1/L1C/L5, BeiDou (Chinese Navigation Satellite System) B1/B2, GLONASS (GLObal NAvigation Satellite System of the Russian government) L1/L3/L5, QZSS (Quasi-Zenith Satellite System developed by the Japanese government) L1/L5 and IRNSS (Indian Regional Navigation Satellite System) L5, as well as all SBAS (Satellite Based Augmentation System) signals. The ability of the GNSS to detect such a broad range of signals allows for high-accuracy positioning. The whole SoC, which is connected to a small passive antenna, provides precise position, velocity and time or raw GNSS data for hybridization with an IMU (inertial measurement unit) without the need for an external application processor. Additionally, user applications can be executed directly in the SoC. It works in the −40 to +105 °C temperature range with a 1.5 V supply. The assembled test chip has 100 pins in a QFN (quad-flat no-leads) package and needs only a quartz crystal for the on-chip reference clock driver and optional SAW (surface acoustic wave) filters. The radio performance for both wideband (52 MHz) channels centered at L1/E1 and L5/E5 is NF = 2.3 dB, G = 131 dB, with 121 dBc/Hz of phase noise @ 1 MHz offset from the carrier; the receiver consumes 35 mW and occupies a 4.5 mm2 silicon area. The SoC reported in the paper is the first ever dual-frequency single-chip GNSS receiver equipped with a multi-core application microcontroller integrated with embedded flash memory for the user application program.
APA, Harvard, Vancouver, ISO, and other styles
41

Akasiadis, Charilaos, Vassilis Pitsilis, and Constantine D. Spyropoulos. "A Multi-Protocol IoT Platform Based on Open-Source Frameworks." Sensors 19, no. 19 (2019): 4217. http://dx.doi.org/10.3390/s19194217.

Full text
Abstract:
Internet of Things (IoT) technologies have evolved rapidly during the last decade, and many architecture types have been proposed for distributed and interconnected systems. However, most systems are implemented following fragmented approaches for specific application domains, introducing difficulties in providing unified solutions; yet such unification is an important feature from an IoT perspective. In this paper, we present an IoT platform that supports multiple application-layer communication protocols (Representational State Transfer (REST)/HyperText Transfer Protocol (HTTP), Message Queuing Telemetry Transport (MQTT), Advanced Message Queuing Protocol (AMQP), Constrained Application Protocol (CoAP), and WebSockets) and that is composed of open-source frameworks (RabbitMQ, Ponte, OM2M, and RDF4J). We have explored a back-end system that interoperates with the various frameworks and offers a single approach for user access control on IoT data streams and micro-services. The proposed platform is evaluated using its containerized version, being easily deployable on the vast majority of modern computing infrastructures. Its design promotes service reusability and follows a marketplace architecture, so that the creation of interoperable IoT ecosystems with active contributors is enabled. All the platform's features are analyzed, and we discuss the results of experiments in which the multiple communication protocols are tested when used interchangeably for transferring data. Developing unified solutions using such a platform is of interest to users and developers, as they can test and evaluate local instances or even complex applications composed of their own IoT resources before releasing a production version to the marketplace.
APA, Harvard, Vancouver, ISO, and other styles
42

Wright, Adam, Angela Ai, Joan Ash, et al. "Clinical decision support alert malfunctions: analysis and empirically derived taxonomy." Journal of the American Medical Informatics Association 25, no. 5 (2017): 496–506. http://dx.doi.org/10.1093/jamia/ocx106.

Full text
Abstract:
Objective: To develop an empirically derived taxonomy of clinical decision support (CDS) alert malfunctions. Materials and Methods: We identified CDS alert malfunctions using a mix of qualitative and quantitative methods: (1) site visits with interviews of chief medical informatics officers, CDS developers, clinical leaders, and CDS end users; (2) surveys of chief medical informatics officers; (3) analysis of CDS firing rates; and (4) analysis of CDS overrides. We used a multi-round, manual, iterative card sort to develop a multi-axial, empirically derived taxonomy of CDS malfunctions. Results: We analyzed 68 CDS alert malfunction cases from 14 sites across the United States with diverse electronic health record systems. Four primary axes emerged: the cause of the malfunction, its mode of discovery, when it began, and how it affected rule firing. Build errors, conceptualization errors, and the introduction of new concepts or terms were the most frequent causes. User reports were the predominant mode of discovery. Many malfunctions within our database caused rules to fire for patients for whom they should not have (false positives), but the reverse (false negatives) was also common. Discussion: Across organizations and electronic health record systems, similar malfunction patterns recurred. Challenges included updates to code sets and values, software issues at the time of system upgrades, difficulties with migration of CDS content between computing environments, and the challenge of correctly conceptualizing and building CDS. Conclusion: CDS alert malfunctions are frequent. The empirically derived taxonomy formalizes the common recurring issues that cause these malfunctions, helping CDS developers anticipate and prevent CDS malfunctions before they occur or detect and resolve them expediently.
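The abstract lists analysis of CDS firing rates as one discovery channel for malfunctions. A minimal stand-in for such a detector, flagging days whose alert count deviates sharply from a trailing window, might look like the sketch below (the window size and z-score threshold are my assumptions, not the study's parameters):

```python
import statistics

def firing_rate_anomalies(daily_counts, window=7, z_thresh=3.0):
    """Flag indices whose alert firing count deviates sharply from the
    trailing window -- a simple stand-in for the firing-rate analysis
    used to surface rule malfunctions."""
    flagged = []
    for i in range(window, len(daily_counts)):
        hist = daily_counts[i - window:i]
        mu = statistics.fmean(hist)
        sd = statistics.pstdev(hist) or 1.0  # avoid div-by-zero on flat history
        if abs(daily_counts[i] - mu) / sd > z_thresh:
            flagged.append(i)
    return flagged
```

A sudden spike (rule firing for patients it should not) or collapse (rule silently stopping) both show up as large z-scores against recent history.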
APA, Harvard, Vancouver, ISO, and other styles
43

Mamaghani and Salvaggio. "Multispectral Sensor Calibration and Characterization for sUAS Remote Sensing." Sensors 19, no. 20 (2019): 4453. http://dx.doi.org/10.3390/s19204453.

Full text
Abstract:
This paper focuses on the calibration of multispectral sensors typically used for remote sensing. These systems are often provided with "factory" radiometric calibration and vignette correction parameters. These parameters, which are assumed to be accurate when the sensor is new, may change as the camera is utilized in real-world conditions. As a result, regular calibration and characterization of any sensor should be conducted. An end-user laboratory method for computing both the vignette correction and the radiometric calibration function is discussed in this paper. As an exemplar, this method for radiance computation is compared to the method provided by MicaSense for their RedEdge series of sensors. The proposed method and the MicaSense-provided method for radiance computation are applied to a variety of images captured in the laboratory using a traceable source. In addition, a complete error propagation is conducted to quantify the error produced when images are converted from digital counts to radiance. The proposed methodology was shown to produce lower errors in radiance imagery. The average percent error in radiance was −10.98%, −0.43%, 3.59%, 32.81% and −17.08% using the MicaSense-provided method and their "factory" parameters, while the proposed method produced errors of 3.44%, 2.93%, 2.93%, 3.70% and 0.72% for the blue, green, red, near-infrared and red edge bands, respectively. To further quantify the error in terms commonly used in remote sensing applications, the error in radiance was propagated to a reflectance error and additionally used to compute errors in two widely used parameters for assessing vegetation health, NDVI and NDRE. For the NDVI example, the ground reference was computed to be 0.899 ± 0.006, while the provided MicaSense method produced a value of 0.876 ± 0.005 and the proposed method produced a value of 0.897 ± 0.007. For NDRE, the ground reference was 0.455 ± 0.028, the MicaSense method produced 0.239 ± 0.026 and the proposed method produced 0.435 ± 0.038.
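The two vegetation indices quoted above, and a generic first-order propagation of independent band errors into an index uncertainty (a sketch in the spirit of, but not reproducing, the paper's full error budget), can be written as:

```python
import math

def ndvi(nir, red):
    """Normalized Difference Vegetation Index."""
    return (nir - red) / (nir + red)

def ndre(nir, red_edge):
    """Normalized Difference Red Edge index."""
    return (nir - red_edge) / (nir + red_edge)

def index_uncertainty(a, b, sigma_a, sigma_b):
    """First-order propagation of independent band errors through
    I = (a - b) / (a + b), using dI/da = 2b/(a+b)^2 and dI/db = -2a/(a+b)^2."""
    d = (a + b) ** 2
    return math.sqrt((2 * b / d * sigma_a) ** 2 + (2 * a / d * sigma_b) ** 2)
```

This makes visible why a band radiance error propagates into the ± figures on NDVI and NDRE reported in the abstract.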
APA, Harvard, Vancouver, ISO, and other styles
44

Frnda, Jaroslav, Marek Durica, Jan Nedoma, Stanislav Zabka, Radek Martinek, and Michal Kostelansky. "A Weather Forecast Model Accuracy Analysis and ECMWF Enhancement Proposal by Neural Network." Sensors 19, no. 23 (2019): 5144. http://dx.doi.org/10.3390/s19235144.

Full text
Abstract:
This paper presents a neural network approach for weather forecast improvement. Predicted parameters, such as air temperature or precipitation, play a crucial role not only in the transportation sector but also in people's everyday activities. Numerical weather models require real measured data for a correct forecast run. This data is obtained from automatic weather stations by intelligent sensors. Sensor data collection and processing is a necessity for finding the optimal estimation of weather conditions. The European Centre for Medium-Range Weather Forecasts (ECMWF) model serves as the main base for medium-range predictions among the European countries. This model is capable of providing forecasts up to 10 days ahead with a horizontal resolution of 9 km. Although ECMWF is currently the global weather system with the highest horizontal resolution, this resolution is still two times coarser than the one offered by limited-area (regional) numerical models (e.g., ALADIN, which is used in many European and North African countries). These use the global forecasting model and a sensor-based weather monitoring network as the input parameters (the global atmospheric situation at the regional model's geographic boundaries, and a description of atmospheric conditions in numerical form), and because the analysed area is much smaller (typically one country), computing power allows them to use an even higher resolution for predicting key meteorological parameters. However, the forecast data obtained from regional models are available only for a specific country, and end-users cannot find them all in one place. Furthermore, not all members provide open access to these data. Although the ECMWF model is commercial, several web services offer it free of charge. Additionally, because this model delivers forecast predictions for the whole of Europe (and for the whole world, too), this approach is more user-friendly and attractive for potential customers. Therefore, the proposed novel hybrid method based on machine learning is capable of increasing the accuracy of ECMWF forecast outputs to the same level as limited-area models provide, and it can deliver a more accurate forecast in real time.
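The paper's hybrid method post-processes ECMWF outputs with a neural network; as a much simpler illustration of the same post-processing idea, a linear bias correction can be fitted from past (forecast, observed) pairs by least squares (this linear stand-in is my simplification, not the authors' network):

```python
def fit_bias_correction(forecast, observed):
    """Fit observed ~= a * forecast + b by ordinary least squares,
    a minimal stand-in for learning a correction of model output
    toward station observations."""
    n = len(forecast)
    mf = sum(forecast) / n
    mo = sum(observed) / n
    cov = sum((f - mf) * (o - mo) for f, o in zip(forecast, observed))
    var = sum((f - mf) ** 2 for f in forecast)
    a = cov / var
    b = mo - a * mf
    return a, b

def correct(a, b, raw_forecast):
    """Apply the learned correction to a new raw model value."""
    return a * raw_forecast + b
```

A neural network generalizes this to nonlinear, multi-parameter corrections, but the training target is the same: shrink the gap between model output and sensor observations.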
APA, Harvard, Vancouver, ISO, and other styles
45

Pendrill, L. R., A. Allard, N. Fischer, P. M. Harris, J. Nguyen, and I. M. Smith. "Software to Maximize End-User Uptake of Conformity Assessment With Measurement Uncertainty, Including Bivariate Cases. The European EMPIR CASoft Project." NCSL International Measure 13, no. 1 (2021): 58–69. http://dx.doi.org/10.51843/measure.13.1.6.

Full text
Abstract:
Facilitating the uptake of established methodologies for risk-based decision-making in product conformity assessment, taking into account measurement uncertainty, by providing dedicated software is the aim of the European project EMPIR CASoft (2018–2020), involving the National Measurement Institutes of France, Sweden and the UK, with the industrial partner Trescal (FR) as primary supporter. The freely available software helps end-users perform the required risk calculations in accordance with current practice and regulations, and extends that current practice to include bivariate cases. The software is also aimed at supporting testing and calibration laboratories in the application of the latest version of the ISO/IEC 17025:2017 standard, which requires that "…the laboratory shall document the decision rule employed, taking into account the level of risk […] associated with the decision rule and apply the decision rule." Initial experiences following the launch of the new software in Spring 2020 are reported.
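The abstract does not reproduce the risk formulas, but the univariate "specific risk" behind such conformity-assessment software (in the spirit of JCGM 106; the Gaussian posterior and flat prior here are my assumptions, and the function names are illustrative) can be sketched as:

```python
import math

def phi(x):
    """Standard normal cumulative distribution function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def specific_false_accept_risk(y, t_low, t_high, u):
    """Probability that the true value lies outside the tolerance interval
    [t_low, t_high] even though the measured value y was accepted,
    modelling the true value's posterior as N(y, u)."""
    inside = phi((t_high - y) / u) - phi((t_low - y) / u)
    return 1.0 - inside
```

A measured value well inside the tolerance band with small uncertainty gives near-zero risk; a value sitting exactly on a limit gives roughly 50% risk, which is why decision rules with guard bands matter.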
APA, Harvard, Vancouver, ISO, and other styles
46

Mahmud, Mahadzer. "Spreadsheet Solutions To Laplace's Equation: Seepage And Flow Net." Jurnal Teknologi, May 11, 1996, 53–67. http://dx.doi.org/10.11113/jt.v25.1008.

Full text
Abstract:
There has been increasing interest in applying electronic spreadsheets, traditionally used for financial and accounting purposes, to solve complex engineering problems. This paper describes the application of spreadsheets to solve the finite difference form of Laplace's equation. The simplest form of the finite difference method, also known as the relaxation method, will be used for seepage analysis. The spreadsheet program used for this study was chosen based on the power, flexibility and versatility of the package in various computing environments. Nevertheless, any other spreadsheet program can also perform the same task with varying levels of user-friendliness. Keywords: Spreadsheet modelling, flow net, seepage, finite difference method
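The relaxation scheme the paper implements in spreadsheet cells (each interior node repeatedly replaced by the average of its four neighbours, just as a cell formula like =(B2+D2+C1+C3)/4 behaves under iterative recalculation) can be mimicked in a few lines; the grid layout and convergence tolerance below are illustrative:

```python
def relax_laplace(grid, tol=1e-6, max_iter=10_000):
    """Jacobi-style relaxation for Laplace's equation on a rectangular grid.
    Boundary cells hold fixed (Dirichlet) head values; interior cells are
    iterated to the average of their four neighbours until convergence."""
    rows, cols = len(grid), len(grid[0])
    for _ in range(max_iter):
        new = [row[:] for row in grid]
        delta = 0.0
        for i in range(1, rows - 1):
            for j in range(1, cols - 1):
                new[i][j] = 0.25 * (grid[i - 1][j] + grid[i + 1][j]
                                    + grid[i][j - 1] + grid[i][j + 1])
                delta = max(delta, abs(new[i][j] - grid[i][j]))
        grid = new
        if delta < tol:
            break
    return grid
```

For seepage analysis, the converged grid values are hydraulic heads, from which equipotential lines of the flow net can be contoured.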
APA, Harvard, Vancouver, ISO, and other styles
47

Feoh, Gerson, and Putu WidaGunawan. "PENGUKURAN TINGKAT KEPUASAN PENGGUNA SITUS WEB PEMERINTAH (E-GOVERNMENT) KABUPATEN BADUNG." Jurnal Teknologi Informasi dan Komputer 2, no. 2 (2016). http://dx.doi.org/10.36002/jutik.v2i2.149.

Full text
Abstract:
Currently the Badung regency administration has a government website (e-Gov), but it lacks definitive data on indicators of the level of user satisfaction. This study aims to determine the level of user satisfaction with the Badung regency government website (e-Gov) by measuring satisfaction with a questionnaire whose statement variables follow End-User Computing Satisfaction (EUCS), analysed with the Kano model. The sample used to assess the satisfaction level in this study consisted of students living in Badung regency, as potential users of the government (e-Gov) website. The expected result of this study is a quantitative analysis, obtained via the questionnaire method, of the satisfaction level of users of the e-Gov website of the Badung regency administration. Keywords: e-Government, Badung, Kano Model, EUCS (End-User Computing Satisfaction).
APA, Harvard, Vancouver, ISO, and other styles
48

Chakravarthi, SS, and RJ Kannan. "Detection of anomalies in cloud services using network flowdata analysis." International Journal of Electrical Engineering & Education, February 2, 2020, 002072092090143. http://dx.doi.org/10.1177/0020720920901436.

Full text
Abstract:
Cloud computing has paved an excellent platform for the emergence of cost-effective technological solutions. However, security and privacy issues still remain a stringent challenge during service catering. Explicitly, service utility anomalies are liable to cause severe privacy and security issues in cloud service delivery. As a result, the overall performance of cloud service consumption and the service-level utility of end-user applications are degraded. The open-access and distributed nature of cloud computing is the major reason for its vulnerability to intruders. Security and privacy in cloud services present many challenges and problems still open for research. This paper proposes an intrusion detection method capable of detecting nine categories of attacks in two stages, focusing on establishing a network-based intrusion detection mechanism using machine learning techniques. A model is constructed with a supervised learning methodology using historical network flow data and flow data collected from the Internet.
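The abstract names supervised learning over network flow data without specifying the model; as a generic stand-in (not the paper's classifier, and with made-up feature vectors), a nearest-centroid classifier over flow features shows the shape of such a pipeline:

```python
import math

def train_centroids(flows, labels):
    """Compute one centroid per class from labelled flow feature vectors
    (e.g. packet count, mean packet size) -- a minimal supervised model."""
    sums, counts = {}, {}
    for x, y in zip(flows, labels):
        s = sums.setdefault(y, [0.0] * len(x))
        for i, v in enumerate(x):
            s[i] += v
        counts[y] = counts.get(y, 0) + 1
    return {y: [v / counts[y] for v in s] for y, s in sums.items()}

def classify(centroids, x):
    """Assign a new flow to the class with the nearest centroid."""
    return min(centroids, key=lambda y: math.dist(centroids[y], x))
```

Real deployments replace the centroid model with stronger learners, but the train-on-historical-flows, classify-new-flows structure is the same.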
APA, Harvard, Vancouver, ISO, and other styles
49

"Comparative Analysis: Intrusion Detection in Multi-Cloud Environment to Identify Way Forward." International Journal of Advanced Trends in Computer Science and Engineering 10, no. 3 (2021): 2533–39. http://dx.doi.org/10.30534/ijatcse/2021/1451032021.

Full text
Abstract:
Cloud computing is the emerging platform that is covering individual and corporate needs swiftly. The spread of this global platform ranges from infrastructure to various middleware, front-end and back-end services. At the corporate level, another effective configuration of this phenomenon is the multi-cloud environment, which gives the end-user ultimate control in engaging services from various cloud service providers depending on service ranking, cost and availability. It is therefore now very desirable to obtain infrastructure services from one provider while data services are performed on another cloud, or to have infrastructure services in a distributed environment across multiple clouds. A multi-cloud environment is closely linked with a smartly configured security mechanism to ensure security at rest and in transit. Intrusion detection at the various levels and services of a cloud platform is not an easy task, and when it is spread over multiple clouds the challenge becomes more complex and tedious. On the other side, managing and integrating a multi-cloud computing environment is also highly complex. From a technical point of view, it requires experience and hi-tech skills to formulate sustainable integration between multiple clouds and coherence among various services to provide an encapsulated platform for the end-user. As in a multi-cloud environment the integration can be focused on Infrastructure-as-a-Service (IaaS), Software-as-a-Service (SaaS) and Platform-as-a-Service (PaaS) from various cloud service providers, an API-consistent cloud environment is required, which leads to security and more specifically intrusion detection. The problem arises because most existing network-based intrusion detection systems are designed to deal with known threats and attacks. These systems depend on a rule base that is sufficient in certain environments, but in the case of multi-cloud integration such fixed rule bases and known-threat resilience become a point of concern. It is therefore necessary to look at intrusion detection systems that can adapt to environmental changes and at least indicate unknown or anomalous attacks. A honeypot is a vibrant mechanism to divert the attention of unknown attackers and capture data for analyzing anomalies. Honeypots may not be very useful independently, but along with an intrusion detection system this mechanism works efficiently and provides tangible results. This research paper analyzes the multi-cloud environment, intrusion detection systems and the use of honeypots in existing solutions to understand the possible configurations for effective results in building a sustainable, secure and scalable multi-cloud environment.
APA, Harvard, Vancouver, ISO, and other styles
50

Liu, Chia-Hui, Tzer-Long Chen, Chien-Yun Chang, and Zhen-Yu Wu. "A reliable authentication scheme of personal health records in cloud computing." Wireless Networks, August 12, 2021. http://dx.doi.org/10.1007/s11276-021-02743-7.

Full text
Abstract:
A patient-centered personal health records (PHR) system has been actively promoted in recent years. Its purpose is to maintain long-term personal records and health improvement plans. It combines a cloud computing environment to build a personal health records system that quickly collects personal information and transfers it to the back end for storage and future access. However, in a cloud environment, the message transmission process is more open. Therefore, a lack of an authority security mechanism for the users of such an architecture will result in distrust and doubt by the users, adversely affecting the implementation and quality of long-term health plans. To protect the crucial privacy of users from malicious attacks or theft, it is necessary to ensure that users have differentiated authority to access their personal health records in the cloud computing environment and can manage the openness of their authority to other users. A secure identity authentication mechanism can ensure that only legitimate users can log in to the system and obtain system service resources through verification. For a personal health records system in the cloud computing environment, this study proposes a secure and reliable user authentication mechanism allowing relevant users to access a user's PHR in the cloud based on their authority. The proposed authentication method uses a password combined with a smart card, allowing the owner and authorized users to log in to the system and access the relevant personal records. In this study, an authentication method based on bilinear pairing was used to verify the identity of users and to effectively prevent malicious intrusion and theft.
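The paper's scheme rests on bilinear pairings, which the abstract does not detail; as a much-simplified hash-based stand-in for the password-plus-smart-card idea (every name, message format and masking rule below is an assumption for illustration, not the authors' protocol):

```python
import hashlib
import hmac
import secrets

def h(*parts: bytes) -> bytes:
    """One-way hash over concatenated message parts."""
    return hashlib.sha256(b"|".join(parts)).digest()

class SmartCard:
    """Holds a card value bound to the password at registration, so that
    neither card theft nor password leak alone suffices to log in."""
    def __init__(self, server_key: bytes, user_id: bytes, password: bytes):
        self.masked = bytes(a ^ b for a, b in
                            zip(h(server_key, user_id), h(password)))

    def login_proof(self, user_id: bytes, password: bytes, nonce: bytes) -> bytes:
        # Only the correct password unmasks the card secret h(server_key, uid).
        secret = bytes(a ^ b for a, b in zip(self.masked, h(password)))
        return h(secret, user_id, nonce)

class Server:
    def __init__(self):
        self.key = secrets.token_bytes(32)

    def issue_card(self, user_id: bytes, password: bytes) -> SmartCard:
        return SmartCard(self.key, user_id, password)

    def verify(self, user_id: bytes, nonce: bytes, proof: bytes) -> bool:
        expected = h(h(self.key, user_id), user_id, nonce)
        return hmac.compare_digest(expected, proof)
```

The pairing-based construction in the paper additionally supports delegating access authority to other users, which this two-factor sketch does not attempt.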
APA, Harvard, Vancouver, ISO, and other styles