To see the other types of publications on this topic, follow the link: DPMP [Distributed Policy Management Protocol].

Journal articles on the topic 'DPMP [Distributed Policy Management Protocol]'

Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles


Consult the top 29 journal articles for your research on the topic 'DPMP [Distributed Policy Management Protocol].'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online, whenever these are available in the metadata.

Browse journal articles on a wide variety of disciplines and organise your bibliography correctly.

1

Guerrero Alonso, Juan Ignacio, Enrique Personal, Sebastián García, Antonio Parejo, Mansueto Rossi, Antonio García, Federico Delfino, Ricardo Pérez, and Carlos León. "Flexibility Services Based on OpenADR Protocol for DSO Level." Sensors 20, no. 21 (November 3, 2020): 6266. http://dx.doi.org/10.3390/s20216266.

Full text
Abstract:
Nowadays, Distribution System Operators are increasing the digitalization of their smart grids, making it possible to measure and manage their state at any time. However, with the massive growth of distributed generation (e.g., renewable resources, electric vehicles), grid operation has become more complex, requiring specific technologies to balance it. In this sense, demand-side management is one such technique; demand response is a promising approach for providing Flexibility Services (FSs) and complying with the regulatory directives of the energy market. As a solution, this paper proposes the use of the OpenADR (Open Automated Demand Response) standard protocol in combination with a Decentralized Permissioned Market Place (DPMP) based on Blockchain. On the one hand, the OpenADR hierarchical architecture based on distributed nodes provides communication between stakeholders, adding monitoring and management services. Further, this architecture is compatible with an aggregator schema that guarantees compliance with the strictest regulatory framework (i.e., the European market). On the other hand, the DPMP is included at different levels of this architecture, providing a global solution to Flexibility Service Providers (FSPs) that can be adapted depending on the regulation of a specific country. As a proof of concept, this paper shows the result of a real experimental case, which implements a Capacity Bidding Program where the OpenADR protocol is used as a communication method to control and monitor energy consumption. In parallel, the proposed Blockchain-based DPMP makes it possible to manage the incentives of FSs, enabling the integration of local and global markets.
APA, Harvard, Vancouver, ISO, and other styles
2

Benaouda, A., and N. Benaouda. "A Preventive Multi-agent based Policy for Distributed Hardware Resource Balancing." Engineering, Technology & Applied Science Research 10, no. 3 (June 7, 2020): 5824–31. http://dx.doi.org/10.48084/etasr.3548.

Full text
Abstract:
The success of a business, especially a multi-site extended enterprise, depends on the good management of all its distributed resources. It is difficult for a company to be successful without reliable and optimal resource management that avoids overstocking certain resources at one site Sitem ∈ E while understocking the same resources at another site Sitep ∈ E. In both cases, there is a loss of profit. In this paper, we address this situation by proposing an architecture based on the cooperative multi-agent systems paradigm combined with the Contract-Net protocol. We introduce an intelligent agent whose role is to warn in advance, for each item itemi ∈ Sitem, of impending shortages and stock excesses, balancing inter-site availability through a flow of resources of the same itemi among the other sites of E whose levels are over-stocked or under-stocked.
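The balancing policy this abstract describes can be sketched as follows. This is a minimal illustration only; the function name, thresholds, and site names are hypothetical, not taken from the paper, and the actual system uses multi-agent Contract-Net negotiation rather than a central function:

```python
# Hypothetical sketch: detect over- and under-stocked sites for an item and
# propose transfers that bring each site back into an acceptable range.
def propose_transfers(stock, low, high):
    """Return (site_from, site_to, qty) moves for sites outside [low, high]."""
    surplus = {s: q - high for s, q in stock.items() if q > high}
    deficit = {s: low - q for s, q in stock.items() if q < low}
    moves = []
    for src, extra in surplus.items():
        for dst in list(deficit):
            if extra == 0:
                break
            qty = min(extra, deficit[dst])  # ship only what is needed
            moves.append((src, dst, qty))
            extra -= qty
            deficit[dst] -= qty
            if deficit[dst] == 0:
                del deficit[dst]
    return moves

stock = {"Site1": 120, "Site2": 10, "Site3": 50}
print(propose_transfers(stock, low=30, high=80))  # [('Site1', 'Site2', 20)]
```

In the paper's architecture, the equivalent of each proposed move would be announced and awarded through Contract-Net bids between site agents rather than computed centrally.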
3

Jabarulla, Mohamed Yaseen, and Heung-No Lee. "Blockchain-Based Distributed Patient-Centric Image Management System." Applied Sciences 11, no. 1 (December 28, 2020): 196. http://dx.doi.org/10.3390/app11010196.

Full text
Abstract:
In recent years, many researchers have focused on developing a feasible solution for storing and exchanging medical images in the field of health care. Current practices are deployed on cloud-based centralized data centers, which increase maintenance costs, require massive storage space, and raise privacy concerns about sharing information over a network. Therefore, it is important to design a framework that enables sharing and storing big medical data efficiently within a trustless environment. In the present paper, we propose a novel proof-of-concept design for a distributed patient-centric image management (PCIM) system aimed at ensuring the safety and control of patients' private data without using a centralized infrastructure. In this system, we employed the emerging Ethereum blockchain and a distributed file system technology called the InterPlanetary File System (IPFS). We then implemented an Ethereum smart contract, called the patient-centric access control protocol, to enable a distributed and trustworthy access control policy. IPFS provides the means for decentralized storage of medical images with global accessibility. We describe how the PCIM system architecture facilitates distributed and secure patient-centric data access across multiple entities such as hospitals, patients, and image requestors. Finally, we deployed a smart contract prototype on an Ethereum testnet blockchain and evaluated the proposed framework in a Windows environment. The evaluation results demonstrated that the proposed scheme is efficient and feasible.
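The patient-centric access-control idea in this abstract can be modelled in a few lines. This is a hedged Python sketch, not the paper's Solidity contract: the class and method names are illustrative, and on-chain the caller check would come from the transaction sender rather than an argument.

```python
# Illustrative model of a patient-centric access-control policy: only the
# patient may grant or revoke a requestor's access to an image, identified
# here by a (hypothetical) IPFS content hash.
class PatientAccessControl:
    def __init__(self, patient):
        self.patient = patient
        self.grants = {}  # requestor -> set of image content hashes

    def grant(self, caller, requestor, image_hash):
        if caller != self.patient:  # only the patient controls access
            raise PermissionError("only the patient may grant access")
        self.grants.setdefault(requestor, set()).add(image_hash)

    def revoke(self, caller, requestor, image_hash):
        if caller != self.patient:
            raise PermissionError("only the patient may revoke access")
        self.grants.get(requestor, set()).discard(image_hash)

    def can_read(self, requestor, image_hash):
        return image_hash in self.grants.get(requestor, set())

acl = PatientAccessControl("alice")
acl.grant("alice", "hospital-A", "Qm-example-hash")
print(acl.can_read("hospital-A", "Qm-example-hash"))  # True
print(acl.can_read("hospital-B", "Qm-example-hash"))  # False
```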
4

Son, Bumho, Jaewook Lee, and Huisu Jang. "A Scalable IoT Protocol via an Efficient DAG-Based Distributed Ledger Consensus." Sustainability 12, no. 4 (February 18, 2020): 1529. http://dx.doi.org/10.3390/su12041529.

Full text
Abstract:
The Internet of Things (IoT) suffers from various security vulnerabilities. The use of blockchain technology can help resolve these vulnerabilities, but practical problems with scalability continue to hinder the adoption of blockchain in IoT applications. The directed acyclic graph (DAG)-based Tangle model proposed by the IOTA Foundation aims to avoid transaction fees by employing a different protocol from that used in the blockchain. This model uses the Markov chain Monte Carlo (MCMC) algorithm to update a distributed ledger. However, concerns about centralization by the coordinator nodes remain, and the economic incentive to choose the algorithm is insufficient. The present study proposes a light and efficient distributed ledger update algorithm that considers only the subtangle at each step, using Bayesian inference. Experimental results confirm that the performance of the proposed methodology is similar to that of the existing methodology while enabling faster computation. It also provides the same resistance to possible attacks, and for the same reasons, as the MCMC algorithm.
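The MCMC-style tip selection that the abstract refers to can be sketched as a weighted random walk over the DAG. This is a simplified illustration of the general mechanism, not IOTA's implementation; the data structures, the weight function, and the `alpha` parameter are assumptions for the example.

```python
import math
import random

# Illustrative Tangle-like walk: starting from a reference transaction, step
# to one of its approvers with probability that grows exponentially in the
# approver's cumulative weight, until reaching a tip (an unapproved tx).
def random_walk_to_tip(approvers, weights, start, alpha=0.5, rng=random):
    """approvers[tx] lists transactions that directly approve tx."""
    tx = start
    while approvers.get(tx):
        candidates = approvers[tx]
        probs = [math.exp(alpha * weights[c]) for c in candidates]
        tx = rng.choices(candidates, weights=probs)[0]
    return tx  # a tip: nothing approves it yet

approvers = {"genesis": ["a", "b"], "a": ["c"], "b": [], "c": []}
weights = {"a": 3, "b": 1, "c": 1}
print(random_walk_to_tip(approvers, weights, "genesis") in ("b", "c"))  # True
```

The paper's contribution, as the abstract describes it, is to restrict such walks to the subtangle relevant at each step and weight them via Bayesian inference rather than running the full MCMC procedure.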
5

Li, Aiya, Xianhua Wei, and Zhou He. "Robust Proof of Stake: A New Consensus Protocol for Sustainable Blockchain Systems." Sustainability 12, no. 7 (April 2, 2020): 2824. http://dx.doi.org/10.3390/su12072824.

Full text
Abstract:
In the digital economy era, the development of a distributed, robust economic system has become increasingly important. Blockchain technology can be used to build such a system, but current mainstream consensus protocols are vulnerable to attack, making blockchain systems unsustainable. In this paper, we propose a new Robust Proof of Stake (RPoS) consensus protocol, which uses the amount of coins to select miners and limits the maximum value of the coin age to effectively avoid the coin-age accumulation attack and the Nothing-at-Stake (N@S) attack. Under a comparison framework, we show that RPoS equals or outperforms the Proof of Work (PoW) and Proof of Stake (PoS) protocols in three dimensions: energy consumption, robustness, and transaction processing speed. To compare the three consensus protocols in terms of trade efficiency, we built an agent-based model and found that the RPoS protocol achieves a trade request-satisfied ratio greater than or similar to that of PoW and PoS. Hence, we suggest that RPoS is well suited to building a robust distributed digital economy system.
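The coin-age cap that the abstract describes can be sketched in a few lines. The formula, the cap value, and the function names below are illustrative assumptions, not the paper's actual RPoS specification:

```python
import random

MAX_COIN_AGE = 90  # days; illustrative cap, not taken from the paper

def effective_stake(coins, coin_age_days):
    # Coin age is capped, so hoarding age beyond the cap confers no extra
    # mining advantage (the coin-age accumulation attack gains nothing).
    return coins * min(coin_age_days, MAX_COIN_AGE)

def pick_miner(stakeholders, rng=random):
    """stakeholders: list of (name, coins, coin_age_days) tuples."""
    names = [n for n, _, _ in stakeholders]
    weights = [effective_stake(c, a) for _, c, a in stakeholders]
    return rng.choices(names, weights=weights)[0]

holders = [("A", 100, 30), ("B", 100, 3000)]  # B's huge coin age is capped
print(effective_stake(100, 30), effective_stake(100, 3000))  # 3000 9000
```

Without the cap, holder B's selection weight would be 100× holder A's; with it, the advantage is bounded, which is the robustness property the protocol is named for.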
6

Alaya, Bechir, and Rehanullah Khan. "QoS Enhancement In VoD Systems: Load Management And Replication Policy Optimization Perspectives." Computer Journal 63, no. 10 (July 1, 2020): 1547–63. http://dx.doi.org/10.1093/comjnl/bxaa060.

Full text
Abstract:
The amount of online video content is increasing exponentially, which spurs demand for access to it. Providing optimal quality of service (QoS) for this ever-increasing video data is a challenging task due to the number of QoS constraints. The system resources, the distributed system platform, and the transport protocol thus all need to collaborate to guarantee an acceptable level of QoS for optimal video streaming. In this paper, we present a comprehensive survey of QoS management for video-on-demand systems. First, we focus on load management and replication algorithms in content delivery networks and peer-to-peer (P2P) networks, examining their shortcomings. We also address the problem of admission control and resource allocation with the objectives of congestion avoidance and frame-loss reduction. In addition, we introduce and discuss various replication schemes. For both the client–server architecture and P2P networks, we highlight the need for a specific storage management policy to preserve system reliability and content availability. We also focus on content distribution and the scaling of streaming protocols. We deduce that content availability is linked to the characteristics and performance of the streaming protocols. Finally, we create a comparison table that presents the different contributions of the discussed approaches as well as their limitations. We believe that such a comprehensive survey provides useful insights and contributes to the related domains.
7

Zhao, Fangyuan, Xin Guo, and Wai Kin (Victor) Chan. "Individual Green Certificates on Blockchain: A Simulation Approach." Sustainability 12, no. 9 (May 11, 2020): 3942. http://dx.doi.org/10.3390/su12093942.

Full text
Abstract:
Distributed renewable energy offers an exciting opportunity for sustainable transition and climate change mitigation. However, it is overlooked in most conventional tradable green certificates programs. Blockchain shows an advantage in incorporating a galaxy of distributed prosumers in a transparent and low-cost manner. This paper proposes I-Green, a blockchain-based individual green certificates system for promoting the voluntary adoption of distributed renewable energy. Combining the features of blockchain technology with theories of social norms and peer effects, a novel green-ratio incentive scheme and a proof-of-generation consensus protocol are designed for I-Green. A blockchain simulator is constructed to evaluate the effectiveness and efficiency of the I-Green system. The simulation results demonstrate its potential for facilitating widespread adoption of distributed generation and confirm the feasibility of blockchain as an information and communication technology (ICT).
8

Ariza, Leidy D., and Carlos R. Orjuela. "Why Implement Distributed Systems in Municipal Music Schools in Colombia?" Modern Applied Science 11, no. 9 (August 24, 2017): 92. http://dx.doi.org/10.5539/mas.v11n9p92.

Full text
Abstract:
In Colombia, since 2003, the public policy "National Plan of Music for Citizen Coexistence" has been implemented as a government measure, which provides music courses in each of the country's municipalities. This plan does not take into account the use of technologies to share the experiences of each of the schools. Taking the above into account, this article focuses on the search for technologies that can be used to share multimedia content, such as Content Delivery Networks (CDN), Learning Content Management Systems (LCMS), and Distributed Systems, in order to indicate which technology is the most appropriate for this purpose. To that end, the Wireshark tool is used to capture network traffic for each of the tests performed (upload, display, and deletion of videos) on each configured technology (CDN, LCMS, and Distributed Systems), with the following comparison parameters: real-time traffic, total traffic vs. packet loss, communication exchange, and protocol hierarchies. Statistics are then collected and analyzed to obtain the comparative results needed for this research. Finally, comparing the results of each technology, one can conclude that it is appropriate for the municipal music schools to use distributed systems: the size of the packets sent is smaller than that sent by the CDN and LCMS technologies, there are no multiple communication hops, no prior approval is required to publish content, and there is no limitation on the size of the content to be published.
9

Valdez, Rupa S., Christopher Lunsford, Jiwoon Bae, Lisa C. Letzkus, and Jessica Keim-Malpass. "Self-Management Characterization for Families of Children With Medical Complexity and Their Social Networks: Protocol for a Qualitative Assessment." JMIR Research Protocols 9, no. 1 (January 23, 2020): e14810. http://dx.doi.org/10.2196/14810.

Full text
Abstract:
Background Children with medical complexity (CMC) present rewarding but complex challenges for the health care system. Transforming high-quality care practices for this population requires multiple stakeholders and development of innovative models of care. Importantly, care coordination requires significant self-management by families in home- and community-based settings. Self-management often requires that families of CMC rely on vast and diverse social networks, encompassing both online and offline social relationships with individuals and groups. The result is a support network surrounding the family to help accomplish self-management of medical tasks and care coordination. Objective The goal of this study is to use a theoretically driven perspective to systematically elucidate the range of self-management experiences across families of CMC embedded in diverse social networks and contextual environments. This approach will allow for characterization of the structure and process of self-management of CMC with respect to social networks, both in person and digitally. This research proposal aims to address the significant gaps in the self-management literature surrounding CMC, including the following: (1) how self-management responsibilities are distributed and negotiated among the social network and (2) how individual-, family-, and system-level factors influence self-management approaches for CMC from a theoretically driven perspective. Methods This study will encompass a qualitative descriptive approach to understand self-management practices among CMC and their social networks. Data collection and analysis will be guided by a theoretical and methodological framework, which synthesizes perspectives from nursing, human factors engineering, public health, and family counseling. 
Data collection will consist of semistructured interviews with children, parents, and social network members, inclusive of individuals such as friends, neighbors, and community members, as well as online communities and individuals. Data analysis will consist of a combination of inductive and deductive methods of qualitative content analysis, which will be analyzed at both individual and multiadic levels, where interview data from two or more individuals, focused on the same experience, will be comparatively analyzed. Results This study will take approximately 18 months to complete. Our long-term goals are to translate the qualitative analysis into (1) health IT design guidance for innovative approaches to self-management and (2) direct policy guidance for families of CMC enrolled in Medicaid and private insurance. Conclusions Multiple innovative components of this study will enable us to gain a comprehensive and nuanced understanding of the lived experience of self-management of CMC. In particular, by synthesizing and applying theoretical and methodological approaches from multiple disciplines, we plan to create novel informatics and policy solutions to support their care within home and community settings. International Registered Report Identifier (IRRID) PRR1-10.2196/14810
10

Pan, Yiguang, and Xiaomei Deng. "Incentive Mechanism Design for Distributed Autonomous Organizations Based on the Mutual Insurance Scenario." Complexity 2021 (August 2, 2021): 1–16. http://dx.doi.org/10.1155/2021/9947360.

Full text
Abstract:
The rise of blockchain has led to discussions on new governance models and the cooperation of multiple participants. Due to the cognitive defects of the blockchain protocol in terms of smart contracts and decentralized autonomous organizations (DAOs), it is often unclear how to make decisions about the evolution of blockchain applications. Many autonomous organizations, with the support of network technologies such as blockchain, blindly absorb members and expand the scale of the capital pool, while ignoring the cost advantage that traditional autonomous organizations, based on social relations and mutual supervision, have in fighting information asymmetry. In this context, this study analyzes the evolutionary trend of autonomous organizations and their members' strategies under different policy environments. To this end, against the background of the digital economy and based on game theory, the evolutionary dynamics method, and the form of the mutual insurance organization, this study constructs an evolutionary dynamics model of distributed autonomous organizations. The results show that, in the context of mutual insurance, blind expansion without review aggravates the overall risk pool's moral hazard. Organizational strategies, such as risk pool splits, can effectively improve the risk pool's operating performance and establish a benign competition elimination mechanism. Driven by cooperation efficiency and split supervision based on homogeneous clustering, the comprehensive application of the market elimination mechanism can effectively combat moral hazard, restrain the adverse effects of member flow, expand the living space of small- and medium-sized insurance organizations, curb the emergence of a large-scale monopoly risk pool, and improve market vitality. These conclusions and suggestions also apply to autonomous organizations based on social relations and mutual supervision.
The results offer specific decision-making guidance and suggestions for the government, insurance companies, and risk management.
11

van Leeuwen, Evelien H., Machteld van den Heuvel, Eva Knies, and Toon W. Taris. "Career Crafting Training Intervention for Physicians: Protocol for a Randomized Controlled Trial." JMIR Research Protocols 9, no. 10 (October 8, 2020): e18432. http://dx.doi.org/10.2196/18432.

Full text
Abstract:
Background Physicians work in a highly demanding work setting where ongoing changes affect their work and challenge their employability (ie, their ability and willingness to continue working). In this high-pressure environment, physicians could benefit from proactively managing or crafting their careers; however, they tend not to show this behavior. The new concept of career crafting concerns proactively making choices and adapting behavior regarding both short-term job design (ie, job crafting) as well as longer-term career development (ie, career self-management). However, so far, no intervention studies have aimed at enhancing career crafting behavior among physicians. Given that proactive work and career behavior have been shown to be related to favorable outcomes, we designed an intervention to support career crafting behavior and employability of physicians. Objective The objectives of this study were to describe (1) the development and (2) the design of the evaluation of a randomized controlled career crafting intervention to increase job crafting, career self-management, and employability. Methods A randomized controlled intervention study was designed for 141 physicians in two Dutch hospitals. The study was designed and will be evaluated based on parts of the intervention mapping protocol. First, needs of physicians were assessed through 40 interviews held with physicians and managers. This pointed to a need to support physicians in becoming more proactive regarding their careers as well as in building awareness of proactive behaviors in order to craft their current work situation. Based on this, a training program was developed in line with their needs. A number of theoretical methods and practical applications were selected as the building blocks of the training. Next, participants were randomly assigned to either the waitlist-control group (ie, received no training) or the intervention group. 
The intervention group participated in a 4-hour training session and worked on four self-set goals. Then, a coaching conversation took place over the phone. Digital questionnaires distributed before and 8 weeks after the intervention assessed changes in job crafting, career self-management, employability, and changes in the following additional variables: job satisfaction, career satisfaction, work-home interference, work ability, and performance. In addition, a process evaluation was conducted to examine factors that may have promoted or hindered the effectiveness of the intervention. Results Data collection was completed in March 2020. Evaluation of outcomes and the research process started in April 2020. Study results were submitted for publication in September 2020. Conclusions This study protocol gives insight into the systematic development and design of a career crafting training intervention that is aimed to enhance job crafting, career self-management, and employability. This study will provide valuable information to physicians, managers, policy makers, and other researchers that aim to enhance career crafting. International Registered Report Identifier (IRRID) RR1-10.2196/18432
12

Han, Tao, Seyed Bozorgi, Ayda Orang, Ali Hosseinabadi, Arun Sangaiah, and Mu-Yen Chen. "A Hybrid Unequal Clustering Based on Density with Energy Conservation in Wireless Nodes." Sustainability 11, no. 3 (January 31, 2019): 746. http://dx.doi.org/10.3390/su11030746.

Full text
Abstract:
The Internet of Things (IoT) makes communication possible between smart devices and any object at any time. In this context, wireless nodes play an important role in reducing costs and simplifying use. Since these nodes are often deployed in less accessible locations, recharging their batteries is hardly feasible and in some cases practically impossible. Hence, energy conservation within each node is a challenging issue. Clustering is an efficient solution to increase the lifetime of the network and reduce the energy consumption of the nodes. In this paper, a novel hybrid unequal multi-hop clustering based on density (HCD) is proposed to increase the network lifetime. In the proposed protocol, cluster head (CH) selection is performed only by comparing the status of each node to that of its neighboring nodes. The parameters considered in CH selection are the energy of the nodes, the number of neighboring nodes, the distance to the base station (BS), and the layer in which the node is placed. This new and simple technique thus accounts for the energy consumption of the network and load balancing. Clustering is performed unequally so that cluster heads (CHs) close to the BS have more energy for data relay. A hybrid dynamic–static clustering is also performed to decrease overhead. In the proposed protocol, distributed clustering and a multi-hop routing approach are applied from cluster members (CMs) to CHs, and from CHs to the BS. HCD also applies a novel assistance-to-cluster-heads (ACHs) mechanism, whereby a CH accepts member nodes in a suitable state to share its traffic load. Furthermore, we performed simulations for two different scenarios. The simulation results showed the reliability of the proposed method, as it resulted in a significant increase in network stability and energy balance, as well as network lifetime and efficiency.
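The CH selection criterion the abstract describes (comparing each node's energy, neighbour count, and distance to the BS against its neighbours) can be sketched as a score function. The weights, the linear form, and the assumption that all inputs are normalized to [0, 1] are illustrative choices for this example, not the HCD protocol's actual formula:

```python
# Illustrative CH-eligibility score: more residual energy and a denser
# neighbourhood help; greater distance to the base station hurts.
def ch_score(energy, density, dist_to_bs, w=(0.5, 0.3, 0.2)):
    # All inputs assumed normalized to [0, 1]; weights are hypothetical.
    return w[0] * energy + w[1] * density - w[2] * dist_to_bs

def elect_ch(nodes):
    """nodes: dict of name -> (energy, density, dist_to_bs), all in [0, 1]."""
    return max(nodes, key=lambda n: ch_score(*nodes[n]))

nodes = {
    "n1": (0.9, 0.6, 0.40),  # high energy, but far from the BS
    "n2": (0.7, 0.9, 0.25),  # dense neighbourhood, moderately close
    "n3": (0.4, 0.4, 0.10),  # low energy, very close
}
print(elect_ch(nodes))  # n2
```

In the distributed protocol, each node would evaluate such a comparison locally against its neighbours rather than a central function electing one CH for the whole network.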
13

Gerey, AbdolGhafour, Amirpouya Sarraf, and Hassan Ahmadi. "Groundwater Single- and Multiobjective Optimization Using Harris Hawks and Multiobjective Billiards-Inspired Algorithm." Shock and Vibration 2021 (June 30, 2021): 1–16. http://dx.doi.org/10.1155/2021/4531212.

Full text
Abstract:
This is the first attempt to combine the Multiobjective Billiards-Inspired Optimization Algorithm (MOBOA) with groundwater modelling to determine pumping rates within a well-distributed range of Pareto options. In this study, in order to determine an optimum solution for groundwater drawdown, pumping rates were selected according to three minimization objectives: minimizing the shortage caused by an inability to supply, minimizing the adjusted shortage index, and minimizing the degree of drawdown within predefined areas. To optimize the hydraulic conductivity and specific yield parameters of a modular three-dimensional finite-difference (MODFLOW) groundwater model, the Harris Hawks optimization algorithm was used to minimize the sum of absolute deviations between observed and simulated water-table levels. MOBOA was then utilized to optimize the pumping rate variables for an Iranian arid to semi-arid groundwater environment using these parameters. According to the study results, when the maximum and minimum aquifer drawdown was specified in the range of −40 to +40 cm/year, the Pareto parameter sets produced satisfactory results. Overall, the "Simulation-Optimization-Modelling" protocol was able to generate a series of optimal solutions displayed on a Pareto front. The study concluded with an optimum approach that provides policy makers in Iranian water-stressed zones with safe groundwater management alternatives.
14

Murphy, Chris, Leanne Atkin, Jenny Hurlow, Terry Swanson, and Melina Vega de Ceniga. "Wound hygiene survey: awareness, implementation, barriers and outcomes." Journal of Wound Care 30, no. 7 (July 2, 2021): 582–90. http://dx.doi.org/10.12968/jowc.2021.30.7.582.

Full text
Abstract:
Objective: In light of the COVID-19 pandemic, which has resulted in changes to caseload management, access to training and education, and other additional pressures, a survey was developed to understand current awareness and implementation of the wound hygiene concept into practice one year on from its dissemination. Barriers to implementation and outcomes were also surveyed. Method: The 26-question survey, a mixture of multiple choice and free-text, was developed by the Journal of Wound Care projects team, in consultation with ConvaTec, and distributed globally via email and online; the survey was open for just over 12 weeks. Due to the exploratory nature of the research, non-probability sampling was used. The authors reviewed the outputs of the survey to draw conclusions from the data, with the support of a medical writer. Results: There were 1478 respondents who agreed to the use of their anonymised aggregated data. Nearly 90% were from the US or UK, and the majority worked in wound care specialist roles, equally distributed between community and acute care settings; 66.6% had been in wound care for more than 8 years. The respondents work across the spectrum of wound types. More than half (57.4%) had heard of the concept of wound hygiene, of whom 75.3% have implemented it; 78.7% answered that they ‘always’ apply wound hygiene and 20.8% ‘sometimes’ do so. The top three barriers to adoption were confidence (39.0%), the desire for more research (25.7%) and competence (24.8%). Overall, following implementation of wound hygiene, 80.3% reported that their patients' healing rates had improved. Conclusion: Respondents strongly agreed that implementing wound hygiene is a successful approach for biofilm management and a critical component for improving wound healing rates in hard-to-heal wounds. 
However, the barriers to its uptake and implementation demonstrate that comprehensive education and training, institutional support for policy and protocol changes, and more clinical research are needed to support wound hygiene.
15

Corbett, Mark, David Marshall, Melissa Harden, Sam Oddie, Robert Phillips, and William McGuire. "Treatment of extravasation injuries in infants and young children: a scoping review and survey." Health Technology Assessment 22, no. 46 (August 2018): 1–112. http://dx.doi.org/10.3310/hta22460.

Full text
Abstract:
Background: Extravasation injuries are caused by unintended leakages of fluids or medicines from intravenous lines, but there is no consensus on the best treatment approaches. Objectives: To identify which treatments may be best for treating extravasation injuries in infants and young children. Design: Scoping review and survey of practice. Population: Children aged < 18 years with extravasation injuries and NHS staff who treat children with extravasation injuries. Interventions: Any treatment for extravasation injury. Main outcome measures: Wound healing time, infection, pain, scarring, functional impairment, requirement for surgery. Data sources: Twelve database searches were carried out in February 2017 without date restrictions, including MEDLINE, CINAHL (Cumulative Index to Nursing and Allied Health Literature) Plus and EMBASE (Excerpta Medica dataBASE). Methods: Scoping review – studies were screened in duplicate. Data were extracted by one researcher and checked by another. Studies were grouped by design, and then by intervention, with details summarised narratively and in tables. The survey questionnaire was distributed to NHS staff at neonatal units, paediatric intensive care units and principal oncology/haematology units. Summary results were presented narratively and in tables and figures. Results: The evidence identified in the scoping review mostly comprised small, retrospective, uncontrolled group studies or case reports. The studies covered a wide range of interventions including conservative management approaches, saline flush-out techniques (with or without prior hyaluronidase), hyaluronidase (without flush-out), artificial skin treatments, debridement and plastic surgery. Few studies graded injury severity, and the results sections and outcomes reported in most studies were limited. There was heterogeneity across study populations in age, types of infusate, injury severity, location of injury and the time gaps between injury identification and subsequent treatment. 
Some of the better evidence related to studies of flush-out techniques. The NHS survey yielded 63 responses from hospital units across the UK. Results indicated that, although most units had a written protocol or guideline for treating extravasation injuries, only one-third of documents included a staging system for grading injury severity. In neonatal units, parenteral nutrition caused most extravasation injuries. In principal oncology/haematology units, most injuries were due to vesicant chemotherapies. The most frequently used interventions were elevation of the affected area and analgesics. Warm or cold compresses were rarely used. Saline flush-out treatments, either with or without hyaluronidase, were regularly used in about half of all neonatal units. Most responders thought a randomised controlled trial might be a viable future research design, though opinions varied greatly by setting. Limitations: Paucity of good-quality studies. Conclusions: There is uncertainty about which treatments are most promising, particularly with respect to treating earlier-stage injuries. Saline flush-out techniques and conservative management approaches are commonly used and may be suitable for evaluation in trials. Future work: Conventional randomised trials may be difficult to perform, although a randomised registry trial may be an appropriate alternative. Funding: The National Institute for Health Research Health Technology Assessment programme.
APA, Harvard, Vancouver, ISO, and other styles
16

Ali, Jehad, Gyu-min Lee, Byeong-hee Roh, Dong Kuk Ryu, and Gyudong Park. "Software-Defined Networking Approaches for Link Failure Recovery: A Survey." Sustainability 12, no. 10 (May 22, 2020): 4255. http://dx.doi.org/10.3390/su12104255.

Full text
Abstract:
Deploying new optimized routing rules on routers is challenging, owing to the tight coupling of the data and control planes and a lack of global topological information. Due to the distributed nature of traditional Internet Protocol networks, routing rules and policies are disseminated in a decentralized manner, which causes looping issues during link failure. Software-defined networking (SDN) provides programmability to the network from a central point. Consequently, the nodes or data plane devices in SDN only forward packets, and the complexity of the control plane is handed over to the controller, which installs rules and policies from a central location. Due to this central control, link failure identification and restoration become more tractable because the controller has information about the global network topology, and new optimized rules for link recovery can be deployed from the central point. Herein, we review several schemes for link failure recovery that leverage SDN, while delineating the drawbacks of traditional networking. We also investigate the open research questions posed by the SDN architecture. This paper also analyzes proactive and reactive schemes in SDN using the OpenDayLight controller and Mininet, with the simulation of application scenarios from tactical and data center networks.
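The proactive and reactive recovery schemes this survey contrasts can be sketched in plain Python: a proactive scheme pre-computes a backup path for every link on the primary path before any failure occurs, while a reactive scheme recomputes a path only after the controller learns of the failure. The toy topology, switch names and hop-count BFS routing below are illustrative assumptions, not taken from the paper.

```python
from collections import deque

def bfs_path(adj, src, dst, failed=frozenset()):
    """Shortest hop-count path avoiding any link in `failed`.
    Links are stored as frozensets so direction does not matter."""
    queue, seen = deque([[src]]), {src}
    while queue:
        path = queue.popleft()
        node = path[-1]
        if node == dst:
            return path
        for nxt in adj[node]:
            if nxt not in seen and frozenset((node, nxt)) not in failed:
                seen.add(nxt)
                queue.append(path + [nxt])
    return None

# Toy topology the controller has a global view of.
adj = {
    "s1": ["s2", "s3"],
    "s2": ["s1", "s4"],
    "s3": ["s1", "s4"],
    "s4": ["s2", "s3"],
}

primary = bfs_path(adj, "s1", "s4")  # s1 -> s2 -> s4

# Proactive: pre-install a backup for every link on the primary path.
backups = {}
for a, b in zip(primary, primary[1:]):
    link = frozenset((a, b))
    backups[link] = bfs_path(adj, "s1", "s4", failed={link})

# Reactive: recompute only after the failure is reported.
failed_link = frozenset(("s2", "s4"))
reactive_path = bfs_path(adj, "s1", "s4", failed={failed_link})
```

The proactive dictionary trades switch memory for instant failover; the reactive call trades recovery latency for a smaller rule footprint, mirroring the trade-off the surveyed schemes explore.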
APA, Harvard, Vancouver, ISO, and other styles
17

Raptis, George, Mercedes Perez-Botella, Rebecca Totterdell, Konstantinos Gerasimidis, and Louise J. Michaelis. "A survey of school’s preparedness for managing anaphylaxis in pupils with food allergy." European Journal of Pediatrics 179, no. 10 (April 5, 2020): 1537–45. http://dx.doi.org/10.1007/s00431-020-03645-0.

Full text
Abstract:
Abstract Allergic diseases are on the increase and can affect a child’s well-being. The aim of this survey was to assess regional schools’ preparedness in dealing with anaphylaxis following the publication of national and international guidelines for schools in 2014. The survey was developed in 2015 and distributed to schools in Cumbria, North West England, UK between 2015 and 2016. Only 47% of the respondents (95% CI, 39–57%) felt confident to manage anaphylaxis. Schools without allergic pupils were significantly less likely to have a standard management protocol in place for emergencies compared to those with allergic pupils (p < 0.001). The majority of the schools indicated that further training was needed (81%; 95% CI, 74–88%). Conclusion: At the time of the survey, schools’ preparedness in the region did not meet safety standards recommended by national and international organisations. Although schools have shown eagerness in accessing training in the management of anaphylaxis, tailored training for schools is not yet widely available. There is now an urgent need to design feasible training strategies that create a safe environment for allergic pupils across all UK schools. What is Known: • One quarter of severe allergic reactions take place for the first time while at school, some of them fatal. • School staff are ill-prepared in the management of anaphylaxis, and access to formal training is not widely available. What is New: • School staff remain unconfident in managing a child with a severe allergic reaction. Training in the management of anaphylaxis is scarce and, when available, does not offer the depth required to cover the holistic needs of allergic pupils. • Schools would welcome generic adrenaline autoinjectors and a national policy with central funding describing step by step the necessary measures for the management of anaphylaxis.
APA, Harvard, Vancouver, ISO, and other styles
18

Bryk, Amy, Susannah Koontz, JoAl Mayor, Jeffrey Betcher, Rebecca Tombleson, Ryan Bookout, and Ila M. Saunders. "Characterization of collaborative practice agreements held by hematopoietic stem cell transplant pharmacists." Journal of Oncology Pharmacy Practice 25, no. 3 (December 5, 2017): 558–66. http://dx.doi.org/10.1177/1078155217745145.

Full text
Abstract:
Background Current workforce shortages within the hematopoietic stem cell transplant field necessitate capitalizing on the role of oncology-trained pharmacists. Working within an agreed-upon protocol, pharmacists are able to expand patient care delivery through optimal medication therapy management. Methods An electronic survey was developed by the Advocacy & Policy Working Committee of the American Society for Blood and Marrow Transplantation Pharmacy Special Interest Group and distributed to pharmacists involved in the care of hematopoietic stem cell transplant patients. The primary objective was to assess the current state of collaborative practice agreements in the hematopoietic stem cell transplant setting. Results Forty-eight responses representing 41 institutions were returned. Respondents were mostly female (67%) and practiced in the adult setting (83%). Responses represented a range of practice experience in hematopoietic stem cell transplant, with the majority of positions (83%) funded by the department of pharmacy at an academic medical center. Of the 48 responses, 22 (46%) respondents reported having collaborative practice agreements in place; 10 (21%) did not currently have collaborative practice agreements but were planning to implement them; and 16 (33%) did not have collaborative practice agreements at their institution. Clinical activities performed under a collaborative practice agreement included medication selection and dosing modifications, therapeutic drug monitoring, supportive care management, and management of comorbid conditions and chronic diseases. The most commonly cited barrier to establishing collaborative practice agreements was the inability to secure reimbursement for services provided. No respondents reported a negative impact on job satisfaction.
Conclusions The results of this survey provide the pharmacy community with a robust understanding of the current landscape of hematopoietic stem cell transplant pharmacy collaborative practice agreements.
APA, Harvard, Vancouver, ISO, and other styles
19

El-khani, Ussamah, Hutan Ashrafian, Shahnawaz Rasheed, Harald Veen, Ammar Darwish, David Nott, and Ara Darzi. "The patient safety practices of emergency medical teams in disaster zones: a systematic analysis." BMJ Global Health 4, no. 6 (November 2019): e001889. http://dx.doi.org/10.1136/bmjgh-2019-001889.

Full text
Abstract:
Introduction Disaster zone medical relief has been criticised for poor-quality care and a lack of standardisation and accountability. Traditional patient safety practices of emergency medical teams (EMTs) in disaster zones were not well understood. Improving the quality of healthcare in disaster zones has gained importance within global health policy. Ascertaining the patient safety practices of EMTs in disaster zones may identify areas of practice that can be improved. Methods A systematic search of the OvidSP, Embase and Medline databases; key journals of interest; key grey literature texts; the databases of the WHO, Médecins Sans Frontières and the International Committee of the Red Cross; and Google Scholar was performed. Descriptive studies, case reports, case series, prospective trials and opinion pieces were included, with no limitation on date or language of publication. Results There were 9685 records, evenly distributed between the peer-reviewed and grey literature. Of these, 30 studies and 9 grey literature texts met the inclusion criteria and underwent qualitative synthesis. From these articles, 302 patient safety statements were extracted. Thematic analysis categorised these statements into 84 themes (total frequency 632). The most frequent themes were limb injury (9%), medical records (5.4%), surgery decision-making (4.6%), medicines safety (4.4%) and protocol (4.4%). Conclusion Patient safety practices of EMTs in disaster zones are weighted toward acute clinical care, particularly surgery. The management of non-communicable disease is under-represented. There is widespread recognition of the need to improve medical record-keeping. High-quality data and institutional-level patient safety practices are lacking. There is no consensus on disaster zone-specific performance indicators. These deficiencies represent opportunities to improve patient safety in disaster zones.
APA, Harvard, Vancouver, ISO, and other styles
20

Campano, Miguel Ángel, Samuel Domínguez-Amarillo, Jesica Fernández-Agüera, and Juan José Sendra. "Thermal Perception in Mild Climate: Adaptive Thermal Models for Schools." Sustainability 11, no. 14 (July 19, 2019): 3948. http://dx.doi.org/10.3390/su11143948.

Full text
Abstract:
A comprehensive assessment of indoor environmental conditions is performed on a representative sample of classrooms in schools across southern Spain (Mediterranean climate) to evaluate the thermal comfort level, thermal perception and preference, and the relationship with HVAC systems, with a comparison of seasons and personal clothing. Almost fifty classrooms were studied and around one thousand pool-surveys distributed among their occupants, aged 12 to 17. These measurements were performed during spring, autumn, and winter, considered the most representative periods of use for schools. A new proposed protocol has been developed for the collection and subsequent analysis of data, applying thermal comfort indicators and using the most frequent predictive models, rational (RTC) and adaptive (ATC), for comparison. Cooling is not provided in any of the rooms and natural ventilation is found in most of the spaces during midseasons. Despite the existence of a general heating service in almost all classrooms in the cold period, the use of mechanical ventilation is limited. Heating did not usually provide standard set-point temperatures. However, this did not lead to widespread complaints, as occupants perceive the thermal environment as neutral—varying greatly between users—and show a preference for slightly colder environments. Comparison of these thermal comfort votes and the thermal comfort indicators used showed a better fit of thermal preference over thermal sensation and more reliable results when using regional ATC indicators than the ASHRAE adaptive model. This highlights the significance of inhabitants’ actual thermal perception. These findings provide useful insight for a more accurate design of this type of building, as well as a suitable tool for the improvement of existing spaces, improving the conditions for both comfort and wellbeing in these spaces, as well as providing a better fit of energy use for actual comfort conditions.
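For reference, the ASHRAE adaptive comfort model that the abstract compares against relates the neutral (comfort) indoor temperature linearly to the prevailing mean outdoor temperature; a minimal sketch of the ASHRAE 55 form is below. The regional ATC indicators the authors favour would use different coefficients, which are not given in the abstract, so only the standard form is shown and the 20 °C outdoor mean is a hypothetical midseason value.

```python
def ashrae_adaptive_comfort(t_out_c):
    """ASHRAE 55 adaptive model: neutral indoor operative temperature
    as a linear function of the prevailing mean outdoor air temperature
    (degC), applicable roughly for outdoor means of 10-33.5 degC."""
    t_comf = 0.31 * t_out_c + 17.8
    # 80% acceptability band: +/- 3.5 degC around the neutral temperature.
    return t_comf, (t_comf - 3.5, t_comf + 3.5)

# Hypothetical midseason outdoor mean for a mild Mediterranean climate.
t_comf, (low, high) = ashrae_adaptive_comfort(20.0)
```

At a 20 °C outdoor mean this gives a neutral temperature of about 24 °C, with the 80% acceptability band spanning roughly 20.5-27.5 °C, which is why occupants of mildly under-heated classrooms can still report neutral sensations.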
APA, Harvard, Vancouver, ISO, and other styles
21

Ruibal, Monica, Rod Peakall, Andrew Claridge, Andrew Murray, and Karen Firestone. "Advancement to hair-sampling surveys of a medium-sized mammal: DNA-based individual identification and population estimation of a rare Australian marsupial, the spotted-tailed quoll (Dasyurus maculatus)." Wildlife Research 37, no. 1 (2010): 27. http://dx.doi.org/10.1071/wr09087.

Full text
Abstract:
Context. Enumeration of cryptic/rare or widely distributed mammal species is exceedingly difficult for wildlife managers using standard survey methods. Individual identification via non-invasive hair-DNA methods offers great promise in extending the information available from hairs collected to survey for presence/absence of a species. However, surprisingly few wildlife studies have attempted this because of potential limitations with the field method and genetic samples. Aim. The applicability of hair DNA to identify individuals and estimate numbers was assessed for a rare, medium-sized Australian marsupial carnivore, the spotted-tailed quoll (Dasyurus maculatus). Methods. Hair samples were obtained remotely in the field with baited hair-sampling devices (known as handi-glaze hair tubes) that permit multiple visitations by individuals and species. A hierarchical approach developed and applied to the DNA extraction and PCR protocol, based on single and four pooled hairs of each collected sample, was used to assess genotype reliability (cross-species DNA mixing, allelic dropout and false allele errors) and enumerate the local study population. These results were compared against a concurrent live-cage trapping survey that was equivalent in scale and trap density to enable a rigorous evaluation of the efficiency and reliability of the DNA-based hair-sampling technique. Key results. Of the 288 hair devices deployed, 52 (18%) captured spotted-tailed quoll hair and the majority (90%) of these samples provided adequate DNA to genetically profile individuals at 10 microsatellite loci and a sexing marker. The hierarchical approach provided a feasible way to verify whether cross-species DNA mixing had occurred in the pooled-hair DNA extracts by comparing the results against the independent single-hair DNA extract, and assess genotyping reliability of both DNA concentrations. 
Fewer individuals were detected using hair-sampling (n = 16) than live-trapping (n = 21), despite hair-sampling occurring over a longer period (40 cf. 26 nights). Conclusions. The population-level information gained by the DNA-based technologies adds considerable value to the remote hair-sampling method which up until the present study had been used to detect the presence of medium-sized mammals. Our study demonstrated the utility of the DNA-based hair-sampling method to identify spotted-tailed quoll individuals and for surveying local populations. However, improvements to the hair-sampling method, such as increasing the density of stations or the provision of a food reward, should be considered to enhance sampling efficiency to allow the enumeration of local populations. Implications. The use of remote hair-sampling devices that permit multiple visitations and do not require daily collection can be feasible and reliable to genetically identify individuals when coupled with appropriate strategies. By combining single- and pooled-hair DNA extracts, a good compromise between laboratory efficiency and data integrity is afforded.
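The hierarchical single-versus-pooled genotype comparison described above can be illustrated with a toy consistency check (the loci, allele sizes and decision rules here are hypothetical, for illustration only): a locus showing more than two alleles across the single-hair and pooled-hair extracts suggests cross-species or multi-individual DNA mixing, while disjoint allele sets suggest a genotyping error such as allelic dropout or a false allele.

```python
def check_consistency(single, pooled):
    """Flag loci where the pooled-hair genotype is incompatible with the
    single-hair genotype. Genotypes are dicts: locus -> set of alleles."""
    flagged = {}
    for locus in single:
        s, p = set(single[locus]), set(pooled[locus])
        if len(s | p) > 2:      # more than two alleles at a diploid locus
            flagged[locus] = "possible mixing"
        elif not s & p:         # no allele shared between the two extracts
            flagged[locus] = "possible genotyping error"
    return flagged

# Hypothetical microsatellite allele sizes at three loci.
single = {"L1": {152, 156}, "L2": {98}, "L3": {120, 124}}
pooled = {"L1": {152, 156}, "L2": {98, 102}, "L3": {120, 124, 128}}
flags = check_consistency(single, pooled)  # flags L3 as possible mixing
```

L2 stays unflagged because the pooled extract can legitimately reveal an allele the single hair dropped, as long as the two genotypes remain compatible.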
APA, Harvard, Vancouver, ISO, and other styles
22

Leng, Manman, Yang Yu, Shengping Wang, and Zhiqiang Zhang. "Simulating the Hydrological Processes of a Meso-Scale Watershed on the Loess Plateau, China." Water 12, no. 3 (March 20, 2020): 878. http://dx.doi.org/10.3390/w12030878.

Full text
Abstract:
The Soil and Water Assessment Tool (SWAT) model is widely used to simulate watershed streamflow by integrating complex interactions between climate, geography, soil, vegetation, land use/land cover and other human activities. Although there have been many studies involving sensitivity analysis, uncertainty fitting, and performance evaluation of the SWAT model all over the world, identifying dominant parameters and confirming actual hydrological processes remain essential for studying the effect of climate and land use change on the hydrological regime in water-limited regions. We used hydro-climate and spatial geographical data of a watershed with an area of 3919 km2, located on the Loess Plateau of China, to explore a suitable criterion for selecting parameters to run the model, and to elucidate the dominant parameters that govern the hydrological processes, in order to achieve sound streamflow simulation. Our sensitivity analysis showed that parameters not passing the sensitivity check (p-value < 0.05) could still play a significant role in the hydrological simulation, not only the parameters with p-values below 0.05, indicating that the common protocol of selecting parameters by sensitivity screening alone is not appropriate. The superior performance of the rarely used parameter SOL_BD was likely caused by a combination of lateral and vertical movement of water in the loess soils, due to the run-on infiltration process that occurs in meso-scale watershed monthly streamflow modeling, in contrast with the traditionally held view that infiltration-excess overland flow dominates runoff generation on the Loess Plateau. Overall, the hydrological processes of a meso-scale watershed in the region could be well simulated by the model, though underestimates of monthly streamflow could occur.
Simulated water balance results indicated that evapotranspiration was the main component leaving the watershed, accounting for 88.9% of annual precipitation. Surface runoff contributed 63.2% of the streamflow, followed by lateral flow (36.6%) and groundwater (0.2%). Our research highlights the importance of selecting appropriate parameters for distributed hydrological models, which could help modelers better comprehend the runoff generation mechanism of meso-scale watersheds on the Loess Plateau and provide policy makers with a robust tool for developing sustainable watershed management plans in water-limited regions.
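The reported water-balance partition can be turned into a small arithmetic sketch. The 500 mm annual precipitation figure is hypothetical, and the sketch assumes negligible storage change, so everything not evapotranspired leaves the watershed as streamflow.

```python
def water_balance(precip_mm):
    """Partition annual precipitation using the percentages reported for
    the study watershed: ET = 88.9% of precipitation; streamflow split
    into 63.2% surface runoff, 36.6% lateral flow, 0.2% groundwater."""
    et = 0.889 * precip_mm
    streamflow = precip_mm - et  # neglecting storage change (assumption)
    return {
        "evapotranspiration": et,
        "surface_runoff": 0.632 * streamflow,
        "lateral_flow": 0.366 * streamflow,
        "groundwater": 0.002 * streamflow,
    }

parts = water_balance(500.0)  # hypothetical 500 mm of annual precipitation
```

Because the three streamflow fractions sum to 100%, the four components recover the full precipitation input, which is a quick sanity check on any reported partition of this kind.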
APA, Harvard, Vancouver, ISO, and other styles
23

Castellino, Sharon M., Angela Punnett, Susan K. Parsons, Nicholas P. DeGroote, Sally Muehle, Hongli Li, Michael E. Roth, et al. "Evaluation of Activation of a NCI National Clinical Trials Network (NCTN) Study S1826 for Hodgkin Lymphoma (HL) at Children's Oncology Group (COG) Institutions." Blood 136, Supplement 1 (November 5, 2020): 20. http://dx.doi.org/10.1182/blood-2020-134738.

Full text
Abstract:
Background: HL is an adolescent and young adult (AYA) cancer that lacks uniform approaches across medical and pediatric oncology. Differences include risk classification, chemotherapy backbone and use of radiation therapy. Heterogeneity in institutional programs and resources for AYAs adds to the gap in understanding why outcomes for AYA HL differ. In order to expedite equitable access to novel agents for AYA patients by medical and pediatric oncology providers, an NCTN-facilitated trial for advanced-stage HL was launched. The SWOG-led S1826 trial (NCT03907488), open to patients > 12 years of age, was activated in July 2019. We assessed barriers and facilitators to trial activation at COG institutions for this first-in-kind approach. Methods: A web-based survey was distributed through the COG communications office to institutional principal investigators (PIs) of 216 institutions in North America. To achieve optimal response rates, the survey was distributed in four waves over a 6-week period. Branching logic differentiated questions for institutions that had opened or planned to open the trial vs. those that did not. Topics included institutional characteristics, joint partnership with medical oncologists to activate AYA trials, and specific barriers to opening this trial. Descriptive statistics were calculated using SAS v.7.1. Results: The response rate was 73%, with 158 unique responses among the 216 COG institutions queried. Among responding institutions, 24% were freestanding children's hospitals and 18% were NCI-designated cancer centers. 55% of respondents indicated a known affiliation with an NCTN cooperative group other than COG. 31% indicated prior experience in participating in a non-COG-led NCTN trial for other diseases. 42% of institutions reported a central trials infrastructure for joint pediatric and medical oncology trials. 44% indicated use of the central IRB mechanism, and 4% used a provincial IRB.
While 40% had an established AYA oncology program, 30% reported regular lymphoma tumor boards with medical oncology; 8% indicated the ability to see AYA lymphoma patients in a joint pediatric and medical oncology clinic. The trial is open at 79/158 (50%) COG institutions to date, and an additional 56 indicated future intent to open the trial. Among 135 COG institutions with open or intent-to-open status, 73% of institutional principal investigators (PIs) were pediatric oncologists, 24% were medical oncologists and 4% were joint PIs. PI determination was based on: enrolling as a COG-only site (57%); institutional policy (5%); a discussion among investigators (23%); or other factors (14%). These other factors were categorized as: more resources or anticipated patients in medical oncology (n=4); the trial being opened in medical oncology before pediatrics (n=11); being open in pediatrics before medical oncology (n=2); and no interface for joint studies (n=1). Among the 14% of respondents who indicated the trial would not be opened, a competing trial was the reason in 35%. Other reasons included: lack of awareness of the trial; concerns about study design or the chemotherapy backbone; lack of easily accessible protocol documents; anticipated lack of accrual; concerns around funding support; and challenges with regulatory support, data management, or the institutional process for joint medical and pediatric trials. Respondents' recommendations for facilitating activation of AYA intergroup studies included the need for: increased resources and funding; guidance on communication and navigation with medical oncologists for managing joint trials; institutional infrastructure for AYA trials; a clearer rationale for a change in the chemotherapy backbone relative to prior COG studies; and accessibility and consistency of protocol study naming conventions and protocol documents (i.e. therapy roadmap) on the COG electronic site.
Conclusions: Successful implementation of AYA trials is germane to early access to novel agents for younger adolescents. Overall, COG institutions indicate a high level of endorsement for a NCTN AYA trial for HL with 85% indicating activation completed or planned. This survey suggests that AYA trials can be implemented successfully in a network but require education, early communication between pediatric and medical oncologists, and flexible infrastructure for all group participants. (Funding: U10CA180886, U10CA180888, and UG1CA233230) Disclosures Parsons: Seattle Genetics: Consultancy. Herrera:Bristol Myers Squibb: Consultancy, Other: Travel, Accomodations, Expenses, Research Funding; Merck: Consultancy, Research Funding; Genentech, Inc./F. Hoffmann-La Roche Ltd: Consultancy, Research Funding; Gilead Sciences: Consultancy, Research Funding; Seattle Genetics: Consultancy, Research Funding; Immune Design: Research Funding; AstraZeneca: Research Funding; Karyopharm: Consultancy; Pharmacyclics: Research Funding. Friedberg:Acerta Pharma - A member of the AstraZeneca Group, Bayer HealthCare Pharmaceuticals.: Other; Astellas: Consultancy; Kite Pharmaceuticals: Research Funding; Bayer: Consultancy; Seattle Genetics: Research Funding; Roche: Other: Travel expenses; Portola Pharmaceuticals: Consultancy.
APA, Harvard, Vancouver, ISO, and other styles
24

Nayyar, Anand, Rudra Rameshwar, and Piyush Kanti Dutta. "Special Issue on Recent Trends and Future of Fog and Edge Computing, Services and Enabling Technologies." Scalable Computing: Practice and Experience 20, no. 2 (May 2, 2019): iii—vi. http://dx.doi.org/10.12694/scpe.v20i2.1558.

Full text
Abstract:
Recent Trends and Future of Fog and Edge Computing, Services, and Enabling Technologies. Cloud computing has been established as the most popular and suitable computing infrastructure, providing on-demand, scalable and pay-as-you-go computing resources and services for state-of-the-art ICT applications that generate massive amounts of data. Though the Cloud is certainly the most fitting solution for most applications with respect to processing capability and storage, it may not be so for real-time applications. The main problem with the Cloud is latency, as Cloud data centres are typically very far from both the data sources and the data consumers. This latency is acceptable for application domains such as enterprise or web applications, but not for modern Internet of Things (IoT)-based pervasive and ubiquitous application domains such as autonomous vehicles, smart and pervasive healthcare, real-time traffic monitoring, unmanned aerial vehicles, smart buildings, smart cities, smart manufacturing, cognitive IoT, and so on. The prerequisite for these types of applications is that the latency between data generation and consumption should be minimal. For that, the generated data need to be processed locally instead of being sent to the Cloud. This approach is known as Edge computing, where data processing is done at the network edge in edge devices such as set-top boxes, access points, routers, switches and base stations, which are typically located at the edge of the network. These devices are increasingly being equipped with significant computing and storage capacity to cater to the need for local Big Data processing. The enabling of Edge computing can be attributed to emerging network technologies, such as 4G and cognitive radios, high-speed wireless networks, and energy-efficient sophisticated sensors. Different Edge computing architectures have been proposed (e.g., Fog computing, mobile edge computing (MEC), cloudlets, etc.). 
All of these enable IoT and sensor data to be processed closer to the data sources. Among them, Fog computing, a Cisco initiative, has attracted the most attention from both academia and industry and has emerged as a new computing-infrastructure paradigm in recent years. Though Fog computing has been proposed as a different computing architecture than the Cloud, it is not meant to replace the Cloud. Rather, Fog computing extends Cloud services to the network edge, providing computation, networking, and storage services between end devices and data centres. Ideally, Fog nodes (edge devices) pre-process the data, serve the needs of the associated applications preliminarily, and forward the data to the Cloud if the data need to be stored and analysed further. Fog computing enhances the benefits of smart devices operating not only at the network perimeter but also under cloud servers. Fog-enabled services can be deployed anywhere in the network, and with such service provisioning and management there is huge potential to enhance intelligence within computing networks, realizing context-awareness, fast response times, and network traffic offloading. Several applications of Fog computing are already established, for example sustainable smart cities, smart grids, smart logistics, environment monitoring and video surveillance. To design and implement Fog computing systems, various challenges need to be addressed, concerning system design and implementation, computing and communication, system architecture and integration, application-based implementations, fault tolerance, the design of efficient algorithms and protocols, availability and reliability, security and privacy, and energy-efficiency and sustainability.
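The fog-versus-cloud placement decision this editorial describes can be reduced to a toy policy (all names, RTT values and thresholds below are illustrative assumptions, not from any of the papers): latency-critical tasks that fit within the fog node's spare capacity are served at the edge, and everything else is forwarded to the cloud.

```python
def place_task(latency_budget_ms, cpu_demand, fog_free_cpu,
               cloud_rtt_ms=80.0):
    """Route a task to the fog node when it is both latency-critical and
    the node has spare capacity; otherwise forward it to the cloud.
    All thresholds are illustrative, not taken from any standard."""
    fits_in_fog = cpu_demand <= fog_free_cpu
    needs_low_latency = latency_budget_ms < cloud_rtt_ms
    return "fog" if (needs_low_latency and fits_in_fog) else "cloud"

assignments = [place_task(10, 1, 4),    # latency-critical, fits  -> fog
               place_task(10, 8, 4),    # latency-critical, too big -> cloud
               place_task(500, 1, 4)]   # delay-tolerant          -> cloud
```

Real fog orchestrators weigh many more factors (bandwidth, energy, trust, intercommunication between tasks), which is exactly the design space the papers in this issue explore.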
Also, to make Fog compatible with Cloud several factors such as Fog and Cloud system integration, service collaboration between Fog and Cloud, workload balance between Fog and Cloud, and so on need to be taken care of. It is our great privilege to present before you Volume 20, Issue 2 of the Scalable Computing: Practice and Experience. We had received 20 Research Papers and out of which 14 Papers are selected for Publication. The aim of this special issue is to highlight Recent Trends and Future of Fog and Edge Computing, Services and Enabling technologies. The special issue will present new dimensions of research to researchers and industry professionals with regard to Fog Computing, Cloud Computing and Edge Computing. Sujata Dash et al. contributed a paper titled “Edge and Fog Computing in Healthcare- A Review” in which an in-depth review of fog and mist computing in the area of health care informatics is analysed, classified and discussed. The review presented in this paper is primarily focussed on three main aspects: The requirements of IoT based healthcare model and the description of services provided by fog computing to address then. The architecture of an IoT based health care system embedding fog computing layer and implementation of fog computing layer services along with performance and advantages. In addition to this, the researchers have highlighted the trade-off when allocating computational task to the level of network and also elaborated various challenges and security issues of fog and edge computing related to healthcare applications. Parminder Singh et al. in the paper titled “Triangulation Resource Provisioning for Web Applications in Cloud Computing: A Profit-Aware” proposed a novel triangulation resource provisioning (TRP) technique with a profit-aware surplus VM selection policy to ensure fair resource utilization in hourly billing cycle while giving the quality of service to end-users. 
The proposed technique use time series workload forecasting, CPU utilization and response time in the analysis phase. The proposed technique is tested using CloudSim simulator and R language is used to implement prediction model on ClarkNet weblog. The proposed approach is compared with two baseline approaches i.e. Cost-aware (LRM) and (ARMA). The response time, CPU utilization and predicted request are applied in the analysis and planning phase for scaling decisions. The profit-aware surplus VM selection policy used in the execution phase for select the appropriate VM for scale-down. The result shows that the proposed model for web applications provides fair utilization of resources with minimum cost, thus provides maximum profit to application provider and QoE to the end users. Akshi kumar and Abhilasha Sharma in the paper titled “Ontology driven Social Big Data Analytics for Fog enabled Sentic-Social Governance” utilized a semantic knowledge model for investigating public opinion towards adaption of fog enabled services for governance and comprehending the significance of two s-components (sentic and social) in aforesaid structure that specifically visualize fog enabled Sentic-Social Governance. The results using conventional TF-IDF (Term Frequency-Inverse Document Frequency) feature extraction are empirically compared with ontology driven TF-IDF feature extraction to find the best opinion mining model with optimal accuracy. The results concluded that implementation of ontology driven opinion mining for feature extraction in polarity classification outperforms the traditional TF-IDF method validated over baseline supervised learning algorithms with an average of 7.3% improvement in accuracy and approximately 38% reduction in features has been reported. 
Avinash Kaur and Pooja Gupta in the paper titled “Hybrid Balanced Task Clustering Algorithm for Scientific workflows in Cloud Computing” proposed novel hybrid balanced task clustering algorithm using the parameter of impact factor of workflows along with the structure of workflow and using this technique, tasks can be considered for clustering either vertically or horizontally based on value of impact factor. The testing of the algorithm proposed is done on Workflowsim- an extension of CloudSim and DAG model of workflow was executed. The Algorithm was tested on variables- Execution time of workflow and Performance Gain and compared with four clustering methods: Horizontal Runtime Balancing (HRB), Horizontal Clustering (HC), Horizontal Distance Balancing (HDB) and Horizontal Impact Factor Balancing (HIFB) and results stated that proposed algorithm is almost 5-10% better in makespan time of workflow depending on the workflow used. Pijush Kanti Dutta Pramanik et al. in the paper titled “Green and Sustainable High-Performance Computing with Smartphone Crowd Computing: Benefits, Enablers and Challenges” presented a comprehensive statistical survey of the various commercial CPUs, GPUs, SoCs for smartphones confirming the capability of the SCC as an alternative to HPC. An exhaustive survey is presented on the present and optimistic future of the continuous improvement and research on different aspects of smartphone battery and other alternative power sources which will allow users to use their smartphones for SCC without worrying about the battery running out. Dhanapal and P. Nithyanandam in the paper titled “The Slow HTTP Distributed Denial of Service (DDOS) Attack Detection in Cloud” proposed a novel method to detect slow HTTP DDoS attacks in cloud to overcome the issue of consuming all available server resources and making it unavailable to the real users. The proposed method is implemented using OpenStack cloud platform with slowHTTPTest tool. 
The results stated that the proposed technique detects the attack efficiently. Mandeep Kaur and Rajni Mohana in the paper titled “Static Load Balancing Technique for Geographically partitioned Public Cloud” proposed a novel approach focused upon load balancing in the partitioned public cloud by combining centralized and decentralized approaches, assuming the presence of a fog layer. A load balancer entity is used for decentralized load balancing at the partitions, and a controller entity is used at the centralized level to balance the overall load across the partitions. Results are compared with the First Come First Serve (FCFS) and Shortest Job First (SJF) algorithms; in this work, the researchers compared the waiting time, finish time and actual run time of tasks under these algorithms. To reduce the number of unhandled jobs, a new load state is introduced which checks load beyond the conventional load states. The major objective of this approach is to reduce the need for runtime virtual machine migration and the wastage of resources that may occur due to predefined threshold values. Mukta and Neeraj Gupta in the paper titled “Analytical Available Bandwidth Estimation in Wireless Ad-Hoc Networks considering Mobility in 3-Dimensional Space” proposed an analytical approach named Analytical Available Bandwidth Estimation Including Mobility (AABWM) to estimate the available bandwidth (ABW) on a link. The major contributions of the proposed work are: i) it uses mathematical models based on renewal theory to calculate the collision probability of data packets, which makes the process simple and accurate, and ii) it considers mobility in 3-D space to predict link failures and provide accurate admission control. To test the proposed technique, the researchers used the NS-2 simulator to compare AABWM with AODV, ABE, IAB and IBEM on throughput, packet loss ratio and data delivery. Results stated that AABWM performs better than the other approaches.
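The partitioned load-balancing scheme with an additional load state, as described above for Kaur and Mohana's approach, might be sketched as follows. The state names, thresholds and data layout are invented for illustration only.

```python
def classify_load(used, capacity, thresholds=(0.5, 0.8, 0.95)):
    """Map a partition's utilisation to a load state; the extra 'overloaded'
    state beyond the conventional ones lets the controller reject jobs early."""
    ratio = used / capacity
    if ratio < thresholds[0]:
        return "underloaded"
    if ratio < thresholds[1]:
        return "balanced"
    if ratio < thresholds[2]:
        return "heavy"
    return "overloaded"

def route_job(partitions):
    """Controller: send the job to the least-utilised partition that is
    not overloaded; return None if every partition is saturated."""
    candidates = [p for p in partitions
                  if classify_load(p["used"], p["capacity"]) != "overloaded"]
    if not candidates:
        return None
    return min(candidates, key=lambda p: p["used"] / p["capacity"])["name"]
```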
R. Sridharan and S. Domnic in the paper titled “Placement Strategy for Intercommunicating Tasks of an Elastic Request in Fog-Cloud Environment” proposed a novel heuristic IcAPER (Inter-communication Aware Placement for Elastic Requests) algorithm. The proposed algorithm uses a machine in the network neighborhood for placement once the current resource is fully utilized by the application. The performance of the IcAPER algorithm is compared with the First Come First Serve (FCFS), Random and First Fit Decreasing (FFD) algorithms for the parameters (a) resource utilization, (b) resource fragmentation and (c) number of requests having intercommunicating tasks placed on the same PM, using the CloudSim simulator. Simulation results show IcAPER maps 34% more tasks onto the same PM and also increases resource utilization by 13% while decreasing resource fragmentation by 37.8% when compared to the other algorithms. Velliangiri S. et al. in the paper titled “Trust factor based key distribution protocol in Hybrid Cloud Environment” proposed a novel security protocol comprising two stages. In the first stage, groups are created using the trust factor and a key distribution security protocol is developed; it performs the communication process among the virtual machine nodes, creating several groups based on cluster and trust-factor methods. In the second stage, an ECC (Elliptic Curve Cryptography) based distribution security protocol is developed. The performance of the trust factor based key distribution protocol is compared with the existing ECC and Diffie-Hellman key exchange techniques. The results state that the proposed security protocol provides more secure communication and better resource utilization than the ECC and Diffie-Hellman key exchange techniques in the hybrid cloud. Vivek Kumar Prasad et al.
in the paper titled “Influence of Monitoring: Fog and Edge Computing” discussed various techniques for monitoring in edge and fog computing and their advantages, in addition to a case study based on a healthcare monitoring system. Avinash Kaur et al. elaborated a comprehensive view of the existing data placement schemes proposed in the literature for cloud computing. Further, they classified data placement schemes based on their access capabilities and objectives, and compared the schemes. Parminder Singh et al. presented a comprehensive review of auto-scaling techniques for web applications in cloud computing. A complete taxonomy of the reviewed articles is given on varied parameters such as auto-scaling approach, resources, monitoring tool, experiment, workload and metric. Simar Preet Singh et al. in the paper titled “Dynamic Task Scheduling using Balanced VM Allocation Policy for Fog Computing Platform” proposed a novel scheme to improve user satisfaction by improving the cost to operation length ratio, reducing customer churn, and boosting operational revenue. The proposed scheme was found to reduce the queue size by effectively allocating the resources, which resulted in quicker completion of user workflows. The proposed method's results are evaluated against a state-of-the-art non-power-aware task scheduling mechanism. The results were analyzed using the parameters energy, SLA violation and workflow execution delay. The performance of the proposed scheme was analyzed in various experiments particularly designed to analyze various aspects of workflow processing on the given fog resources. The LRR (35.85 kWh) model has been found most efficient on the basis of average energy consumption in comparison to the LR (34.86 kWh), THR (41.97 kWh), MAD (45.73 kWh) and IQR (47.87 kWh) models. The LRR model has also been observed as the leader when compared on the basis of the number of VM migrations.
The LRR model (2520 VMs) was observed to be the best contender on the basis of the mean number of VM migrations, in comparison with LR (2555 VMs), THR (4769 VMs), MAD (5138 VMs) and IQR (5352 VMs).
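The first stage of Velliangiri S. et al.'s protocol summarized above, group creation by trust factor, can be sketched as below. The node fields and trust threshold are assumptions, and the ECC-based second stage is omitted.

```python
def form_trust_groups(nodes, threshold=0.6):
    """Partition VM nodes for key distribution: nodes whose trust factor
    meets the threshold are grouped by cluster; low-trust nodes are kept
    out of the key distribution groups entirely."""
    trusted = [n for n in nodes if n["trust"] >= threshold]
    untrusted = [n for n in nodes if n["trust"] < threshold]
    groups = {}
    for n in trusted:
        groups.setdefault(n["cluster"], []).append(n["name"])
    return groups, [n["name"] for n in untrusted]
```

A subsequent stage would then run the key agreement (ECC-based in the paper) within each group.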
APA, Harvard, Vancouver, ISO, and other styles
25

Kim, Yunmin, and Tae-Jin Lee. "Learning nodes: machine learning-based energy and data management strategy." EURASIP Journal on Wireless Communications and Networking 2021, no. 1 (September 15, 2021). http://dx.doi.org/10.1186/s13638-021-02047-6.

Full text
Abstract:
The efficient use of resources in wireless communications has always been a major issue. In the Internet of Things (IoT), the energy resource becomes more critical. The transmission policy with the aid of a coordinator is not a viable solution in an IoT network, since a node should report its state to the coordinator for scheduling and it causes serious signaling overhead. Machine learning algorithms can provide the optimal distributed transmission mechanism with little overhead. A node can learn by itself by utilizing the machine learning algorithm and make the optimal transmission decision on its own. In this paper, we propose a novel learning Medium Access Control (MAC) protocol with learning nodes. Nodes learn the optimal transmission policy, i.e., minimizing the data and energy queue levels, using the Q-learning algorithm. The performance evaluation shows that the proposed scheme enhances the queue states and throughput.
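A toy version of the learning-node idea can be written as tabular Q-learning over coarse data/energy queue levels. The states, rewards and channel dynamics below are invented for illustration and are not the paper's model.

```python
import random

def q_learning_policy(steps=2000, alpha=0.1, gamma=0.9, eps=0.1, seed=0):
    """Tabular Q-learning for a node choosing 'transmit'/'wait' from a
    coarse (data_queue, energy_queue) state with levels 0..2."""
    rng = random.Random(seed)
    states = [(d, e) for d in range(3) for e in range(3)]
    actions = ["transmit", "wait"]
    Q = {(s, a): 0.0 for s in states for a in actions}

    def step(state, action):
        d, e = state
        if action == "transmit" and d > 0 and e > 0:
            return (d - 1, e - 1), 1.0      # packet delivered
        if action == "transmit":
            return state, -1.0              # nothing to send or no energy
        # wait: harvest one energy unit, one packet arrives, pay backlog cost
        return (min(d + 1, 2), min(e + 1, 2)), -0.1 * d

    state = (1, 1)
    for _ in range(steps):
        if rng.random() < eps:              # epsilon-greedy exploration
            action = rng.choice(actions)
        else:
            action = max(actions, key=lambda a: Q[(state, a)])
        nxt, reward = step(state, action)
        best_next = max(Q[(nxt, a)] for a in actions)
        Q[(state, action)] += alpha * (reward + gamma * best_next - Q[(state, action)])
        state = nxt
    return Q
```

After training, the learned table prefers transmitting when both queues are non-empty and waiting (harvesting) when there is nothing to send, which is the distributed, coordinator-free behavior the abstract describes.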
APA, Harvard, Vancouver, ISO, and other styles
26

Pelletier, Dominique, David Roos, Marc Bouchoucha, Thomas Schohn, William Roman, Charles Gonson, Thomas Bockel, et al. "A Standardized Workflow Based on the STAVIRO Unbaited Underwater Video System for Monitoring Fish and Habitat Essential Biodiversity Variables in Coastal Areas." Frontiers in Marine Science 8 (July 29, 2021). http://dx.doi.org/10.3389/fmars.2021.689280.

Full text
Abstract:
Essential Biodiversity Variables (EBV) related to benthic habitats and high trophic levels such as fish communities must be measured at fine scale but monitored and assessed at spatial scales that are relevant for policy and management actions. Local scales are important for assessing anthropogenic impacts, and conservation-related and fisheries management actions, while reporting on the conservation status of biodiversity to formulate national and international policies requires much broader scales. Measurements must account for the fact that coastal habitats and fish communities are heterogeneously distributed locally and at larger scales. Assessments based on in situ monitoring generally suffer from poor spatial replication and limited geographical coverage, which is challenging for area-wide assessments. Requirements for appropriate monitoring comprise cost-efficient and standardized observation protocols and data formats, spatially scalable and versatile data workflows, data that comply with the FAIR (Findable, Accessible, Interoperable, and Reusable) principles, while minimizing the environmental impact of measurements. This paper describes a standardized workflow based on remote underwater video that aims to assess fishes (at species and community levels) and habitat-related EBVs in coastal areas. This panoramic unbaited video technique was developed in 2007 to survey both fishes and benthic habitats in a cost-efficient manner, and with minimal effect on biodiversity. It can be deployed in areas where low underwater visibility is not a permanent or major limitation. The technique was consolidated and standardized and has been successfully used in varied settings over the last 12 years. We operationalized the EBV workflow by documenting the field protocol, survey design, image post-processing, EBV production and data curation. 
Applications of the workflow are illustrated here based on some 4,500 observations (fishes and benthic habitats) in the Pacific, Indian and Atlantic Oceans, and Mediterranean Sea. The STAVIRO’s proven track-record of utility and cost-effectiveness indicates that it should be considered by other researchers for future applications.
APA, Harvard, Vancouver, ISO, and other styles
27

"American Society of Clinical Oncology Policy Statement: Oversight of Clinical Research." Journal of Clinical Oncology 21, no. 12 (June 15, 2003): 2377–86. http://dx.doi.org/10.1200/jco.2003.04.026.

Full text
Abstract:
Executive Summary: Well-publicized lapses in the review or implementation of clinical research studies have raised public questions about the integrity of the clinical research process. Public trust in the integrity of research is critical not only for funding and participation in clinical trials but also for confidence in the treatments that result from the trials. The questions raised by these unfortunate cases pose an important opportunity to reassess the clinical trials oversight system to ensure the integrity of clinical research and the safety of those who enroll in clinical trials. Since its inception, the American Society of Clinical Oncology (ASCO) has worked for the advancement of cancer treatments through clinical research and to help patients gain prompt access to scientifically excellent and ethically unimpeachable clinical trials. As an extension of its mission, ASCO is affirming with this policy statement the critical importance of a robust review and oversight system to ensure that clinical trials participants give fully informed consent and that their safety is a top priority. Ensuring the integrity of research cannot be stressed enough because of its seminal connection to the advancement of clinical cancer treatment. The overall goal of this policy is to enhance public trust in the cancer clinical trials process. To achieve this, the following elements are essential: Ensure safety precautions for clinical trial participants and their fully informed consent. Ensure the validity and integrity of scientific research. Enhance the educational training of clinical scientists and research staff to ensure the highest standards of research conduct. Promote accountability and responsibility among all those involved in clinical research (not just those serving on institutional review boards [IRBs], but also institutional officials, researchers, sponsors, and participants) and ensure support for an effective oversight process. 
Enhance the professional and public understanding of clinical research oversight. Enhance the efficiency and cost-effectiveness of the clinical research oversight system. This policy statement makes recommendations in several areas that serve as principles to support an improved system of oversight for clinical research. ASCO will work with all parties involved in the clinical research system to develop the steps necessary to implement these recommendations. Centralized Trial Review: A large percentage of oncology clinical trials are coordinated through the National Cancer Institute’s (NCI) system of cooperative groups, which already incorporates centralized scientific review. As such, there is a tremendous opportunity to employ a centralized mechanism to provide ethical review by highly trained IRB members, allowing local IRBs to take advantage of the financial and time efficiencies that central review provides. Centralized review boards (CRBs) would also contribute consistency and efficiency to the process. Once successfully completed, the review would represent an approval to open the protocol at all of the institutions that have subscribed to the centralized review system. Local IRBs would be able to devote time usually spent on initial review to ongoing monitoring of the trial taking place at their institution. Considering the enormous size and complexity of the clinical research enterprise, ASCO envisions multiple CRBs, which could be distributed as regional review boards. Central review will use a single protocol and consent form, and monitor and evaluate adverse events (AEs) on a global basis, eliminating many of the time-consuming steps for the local IRB. Global monitoring and assessment of AEs has real potential to enhance trial participants’ safety by giving local institutions more information on the overall trial and enabling them to devote more time to ongoing review of the trial onsite.
Use of a CRB also has real potential to reduce the costs of clinical trial oversight by allowing local IRBs to eliminate the costs of initial review. These efficiencies will likely lead to institutions redirecting funds toward monitoring ongoing trials. Although a CRB has potential to improve the efficiency of the process, a CRB could also have tremendous ability to delay valuable trials. Checks and balances must be included in the newly devised system to ensure timely review and appeals of CRB actions. ASCO proposes the advent of a new pilot program for centralizing review of clinical trials. It requires clear engagement of all stakeholders in planning the experiment, clear articulation of the goals, and assurance of federal regulatory protection for institutions choosing to participate. If successful, this CRB pilot project could be expanded to include multi-institutional industry-sponsored research. Education and Training: Education and training are critical to the ultimate success of an improved oversight system. All members of the research team should receive comprehensive education on conducting scientifically and ethically valid clinical research. The curriculum should also include information on the prevailing local and federal regulations that pertain to the clinical trials process. IRB members should also receive ongoing education and training in the review of clinical research protocols. IRB training should pay particular attention to nonscientific members to give them the tools necessary to speak on behalf of research participants. ASCO should develop a curriculum that focuses on the proper conduct of human research and emphasize ethically sound clinical research in the context of its Annual Meeting. Informed Consent: Investigators and review boards have specific roles to play in ensuring the education of trial participants through the informed consent process, both when they are considering trial enrollment and as they participate in the trial.
Review boards and investigators should focus primarily on the informed consent process, rather than the informed consent documents. Federal Oversight: The federal government has an important role to play in the oversight of clinical research. This role should be expanded to cover all research, not just that which is funded by the federal government or conducted with the oversight of the Food and Drug Administration (FDA). The Department of Health and Human Services (HHS) Office for Human Research Protections (OHRP) and the FDA should provide clear regulatory support and guidance for local institutions that choose to employ a CRB. In the case of the pilot CRB discussed in this policy statement, it should serve as the preferred option for the cancer cooperative group clinical trials. Ideally, the federal government should unify and streamline its regulations for the oversight of clinical research. Resources Supporting Clinical Research Infrastructure: An effective oversight process demands the highest quality scientific and ethical review and onsite monitoring of the safety of trial participants. This can only be accomplished by the involvement of an experienced IRB that receives funding, resources, and institutional support enabling it to fulfill its mandate. Conflict of Interest: Critical to the integrity of research is the absence of bias in the process. ASCO strongly recommends the adoption of standards for the identification, management, and, where appropriate, elimination of conflicts of interests, whether they are actual, potential, or apparent.
APA, Harvard, Vancouver, ISO, and other styles
28

Ståhls-Mäkelä, Gunilla, Anniina Kuusijärvi, Ville-Matti Riihikoski, Leif Schulman, and Aino Juslén. "Luomus’ Genomic Resources Collection Available as Open Data Through FinBIF." Biodiversity Information Science and Standards 3 (June 13, 2019). http://dx.doi.org/10.3897/biss.3.37024.

Full text
Abstract:
There is an increasing demand for high-quality genetic samples for biodiversity research as the techniques are rapidly developing and the costs are decreasing. The Finnish Museum of Natural History Luomus, an independent research institute within the University of Helsinki holding and developing the national natural history collections, has joined the Global Genome Biodiversity Network (GGBN; http://www.ggbn.org/ggbn_portal/) and established a Genomic Resources Collection (GRC) in 2018. In March 2019, the Luomus GRC comprised 2500 DNA extractions and 4000 vertebrate tissue samples amassed in approximately the last 10 years. The DNA extractions are mainly of lichens, polypores, beetles, flies, molluscs and crustaceans of worldwide origin, reflecting the focal organism groups of research groups in Luomus. The deep-frozen tissue samples are mostly of Finnish birds and mammals, as accessions of vertebrate specimens acquired into Luomus’ collections are sampled. High-quality whole-genome DNA extracts will also be prepared. We expect the GRC to increase rapidly in numbers of samples within the coming years. Furthermore, the collection will also serve the many active research groups in the Faculty of Biological and Environmental Sciences of the University of Helsinki. The GRC collection follows the best practices of the Global Genome Biodiversity Network (GGBN) concerning long-term storage and physical quality of samples, and international agreements (the Convention on Biological Diversity, the Nagoya Protocol, CITES) as regards the legitimacy of the samples. The GRC samples are always cross-linked with the taxonomically identified and georeferenced voucher specimen from which they are separated. Each GRC sample gets a Unique Resource Identifier HTTP-URI, which is a derivative of the unique specimen ID used in Luomus’ Collection Management System (CMS) ‘Kotka’. The sample tubes are cryolabelled with the QR code on the lid of the tube.
The voucher specimens are deposited in Luomus’ collections or in another international public repository. The data on the GRC samples form part of the Open Data distributed through the Finnish Biodiversity Information Facility FinBIF species.fi (Data policy: https://laji.fi/en/about/960), and will be made searchable at the web portal in 2019. The specific database functions to meet the needs of Luomus’ GRC are developed by Luomus’ Biodiversity Informatics Unit and implemented in Kotka. We have already implemented part of the database tools to manage the compliance with the Nagoya protocol. The tool for registering material transactions (donations / loans) makes use of the Application Programming Interface (API) provided by the Access and Benefit Sharing Clearing House (ABS-CH) and includes links to the ABS-CH webpage (https://absch.cbd.int/). The ABS-CH shows the contact person or organization details of the provider country, and the country-specific requirements for access to genetic resources, when present. The necessary information and documentation (letter of Prior Informed Consent, Mutually Agreed Terms, Material Transaction Agreement, and other permits) are linked from the material transactions to the relevant specimens.
APA, Harvard, Vancouver, ISO, and other styles
29

Theeten, Franck, Marielle Adam, Thomas Vandenberghe, Mathias Dillen, Patrick Semal, Serge Scory, Jean-Marc Herpers, et al. "NaturalHeritage: Bridging Belgian natural history collections." Biodiversity Information Science and Standards 3 (July 4, 2019). http://dx.doi.org/10.3897/biss.3.37854.

Full text
Abstract:
The Royal Belgian Institute of Natural Sciences (RBINS), the Royal Museum for Central Africa (RMCA) and Meise Botanic Garden house more than 50 million specimens covering all fields of natural history. While many different research topics have their own specificities, throughout the years it became apparent that with regard to collection data management, data publication and exchange via community standards, collection holding institutions face similar challenges (James et al. 2018, Rocha et al. 2014). In the past, these have been tackled in different ways by Belgian natural history institutions. In addition to local and national collaborations, there is a great need for a joint structure to share data between scientific institutions in Europe and beyond. It is the aim of large networks and infrastructures such as the Global Biodiversity Information Facility (GBIF), the Biodiversity Information Standards (TDWG), the Distributed System of Scientific collections (DiSSCo) and the Consortium of European Taxonomic Facilities (CETAF) to further implement and improve these efforts, thereby gaining ever-increasing efficiencies. In this context, the three institutions mentioned above submitted the NaturalHeritage project (http://www.belspo.be/belspo/brain-be/themes_3_HebrHistoScien_en.stm) granted in 2017 by the Belgian Science Policy Service, which runs from 2017 to 2020. The project provides links among databases and services. The unique qualities of each database are maintained, while the information can be concentrated and exposed in a structured way via one access point. This approach also aims to link data that are unconnected at present (e.g. relationship between soil/substrate, vegetation and associated fauna) and to improve the cross-validation of data. (1) The NaturalHeritage prototype (http://www.naturalheritage.be) is a shared research portal with an open access infrastructure, which is still in the development phase.
Its backbone is an ElasticSearch catalogue, with Kibana, and a Python aggregator gathering several types of (re)sources: relational databases, REpresentational State Transfer (REST) services of objects databases and bibliographical data, collections metadata and the GBIF Integrated Publishing Toolkit (IPT) for observational and taxonomical data. Semi-structured data in English are semantically analysed and linked to a rich autocomplete mechanism. Keywords and identifiers are indexed and grouped in four categories (“what”, “who”, “where”, “when”). The portal can also act as an Open Archives Initiative Protocol for Metadata Harvesting (OAI-PMH) service and ease indexing of the original webpage on the internet with microdata enrichment. (2) The collection data management system of DaRWIN (Data Research Warehouse Information Network) of RBINS and RMCA has been improved as well. External (meta)data requirements, i.e. foremost publication into or according to the practices and standards of GBIF and OBIS (Ocean Biogeographic Information System: https://obis.org) for biodiversity data, and INSPIRE (https://inspire.ec.europa.eu) for geological data, have been identified and evaluated. New and extended data structures have been created to be compliant with these standards, as well as the necessary procedures developed to expose the data. Quality control tools for taxonomic and geographic names have been developed. Geographic names can be hard to confirm as their lack of context often requires human validation. To address this, a similarity measure is used to help map the result. Species, locations, sampling devices and other properties have been mapped to the World Register of Marine Species and DarwinCore (http://www.marinespecies.org), Marine Regions and GeoNames, the AGRO Agronomy and Vertebrate trait ontologies and the British Oceanographic Data Centre (BODC) vocabularies (http://www.obofoundry.org/ontology/agro.html).
Extensive mapping is necessary to make use of the ExtendedMeasurementOrFact Extension of DarwinCore (https://tools.gbif.org/dwca-validator/extensions.do).
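The four-category keyword grouping (“what”, “who”, “where”, “when”) used by the portal's autocomplete could be sketched as a simple field-to-facet mapping. The field names below are hypothetical and not the actual NaturalHeritage schema.

```python
def facet_keywords(record):
    """Group a record's indexed values into the portal's four facets.

    Field-to-facet assignments here are illustrative placeholders for
    whatever the real aggregator extracts from each source database.
    """
    facets = {"what": [], "who": [], "where": [], "when": []}
    mapping = {"taxon": "what", "keyword": "what",
               "collector": "who", "author": "who",
               "locality": "where", "country": "where",
               "date": "when", "period": "when"}
    for field, value in record.items():
        facet = mapping.get(field)
        if facet:                      # unmapped fields are simply skipped
            facets[facet].append(value)
    return facets
```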
APA, Harvard, Vancouver, ISO, and other styles
