Journal articles on the topic "Workflow efficiencies"


Listed below are the top 50 journal articles on the subject "Workflow efficiencies".


Browse journal articles from a wide range of scientific fields and compile an accurate bibliography.

1

Pang, Zhiqiang, Jasmine Chong, Shuzhao Li, and Jianguo Xia. "MetaboAnalystR 3.0: Toward an Optimized Workflow for Global Metabolomics". Metabolites 10, no. 5 (May 7, 2020): 186. http://dx.doi.org/10.3390/metabo10050186.

Abstract:
Liquid chromatography coupled to high-resolution mass spectrometry platforms are increasingly employed to comprehensively measure metabolome changes in systems biology and complex diseases. Over the past decade, several powerful computational pipelines have been developed for spectral processing, annotation, and analysis. However, significant obstacles remain with regard to parameter settings, computational efficiencies, batch effects, and functional interpretations. Here, we introduce MetaboAnalystR 3.0, a significantly improved pipeline with three key new features: (1) efficient parameter optimization for peak picking; (2) automated batch effect correction; and (3) more accurate pathway activity prediction. Our benchmark studies showed that this workflow was 20~100× faster compared to other well-established workflows and produced more biologically meaningful results. In summary, MetaboAnalystR 3.0 offers an efficient pipeline to support high-throughput global metabolomics in the open-source R environment.
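The batch-effect correction step mentioned in this abstract is a common pain point in LC-MS metabolomics. As a generic, hedged illustration only (this is not the algorithm implemented in MetaboAnalystR, and the data are invented), per-batch median centring in NumPy conveys the basic idea:

```python
# Generic illustration of batch-effect correction by per-batch median centring.
# This is NOT the MetaboAnalystR algorithm; it only shows the general idea on toy data.
import numpy as np

rng = np.random.default_rng(0)

# Toy intensity matrix: 6 samples x 4 metabolite features, two batches of 3 samples.
intensities = rng.lognormal(mean=5.0, sigma=0.3, size=(6, 4))
batches = np.array([0, 0, 0, 1, 1, 1])
intensities[batches == 1] *= 1.8          # simulate a systematic batch shift

corrected = intensities.copy()
overall_median = np.median(intensities, axis=0)
for b in np.unique(batches):
    batch_median = np.median(intensities[batches == b], axis=0)
    # Rescale each batch so its per-feature median matches the overall median.
    corrected[batches == b] *= overall_median / batch_median

print(np.median(corrected[batches == 0], axis=0))
print(np.median(corrected[batches == 1], axis=0))  # medians now agree across batches
```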
2

RATNASWAMY, G., and V. DHARMAVARAM. "Improving Workflow Efficiencies in Protein Formulation Laboratories Using Visual Basic for Applications". Journal of the Association for Laboratory Automation 12, no. 2 (April 2007): 90–98. http://dx.doi.org/10.1016/j.jala.2006.08.007.
3

Arthur, Michael A., and Millie L. Jackson. "Redesigning Technical Services for the 21st Century: A Case Study from the University of Alabama Libraries". Library Resources & Technical Services 64, no. 3 (July 31, 2020): 120. http://dx.doi.org/10.5860/lrts.64n3.120-130.

Abstract:
The University of Alabama Libraries began the process of workflow analysis over a decade ago. Primarily focused on the traditional technical services areas, this process has been iterative and has evolved from looking for efficiencies to a broader change in the culture and an acceptance of an ongoing process of improvement. This article distills lessons learned from workflow analysis in the areas of acquisitions, electronic resources, and cataloging/metadata but also examines how these changes impacted the broader library and philosophies of collection development and management.
4

Davies, Richard, and Sotirios Alpanis. "Behind the scenes of the British Library Qatar National Library Partnership: Digital operations and workflow management". Alexandria: The Journal of National and International Library and Information Issues 28, no. 2 (August 2018): 86–100. http://dx.doi.org/10.1177/0955749019839604.

Abstract:
The British Library and Qatar National Library have been working in partnership since 2012 to make an extraordinary tranche of material available for everyone to access for free via the Qatar Digital Library. This article delves behind the scenes of the partnership, highlighting some of the operational and workflow lessons learnt when running this major digitisation programme. We look in detail at workflow management and coordination, how to use management information to find efficiencies, as well as providing a supportive team atmosphere for creative use of the collections via ‘Hack Days’.
5

Jhala, Meenakshi, and Rahul Menon. "Examining the impact of an asynchronous communication platform versus existing communication methods: an observational study". BMJ Innovations 7, no. 1 (October 6, 2020): 68–74. http://dx.doi.org/10.1136/bmjinnov-2019-000409.

Abstract:
Background: Healthcare systems revolve around intricate relations between humans and technology. System efficiency depends on information exchanges that occur on synchronous and asynchronous platforms. Traditional synchronous methods of communication may pose risks to workflow integrity and contribute to inefficient service delivery and medical care. Aim: To compare synchronous methods of communication to Medic Bleep, an instant messaging asynchronous platform, and observe its impact on clinical workflow, quality of work life and associations with patient safety outcomes and hospital core operations. Methods: Cohorts of healthcare professionals were followed using the Time Motion Study methodology over a 2-week period, using both the asynchronous platform and synchronous methods such as the non-cardiac pager. Questionnaires and interviews were conducted to identify staff attitudes towards both platforms. Results: A statistically significant (p<0.01) reduction of 20.1 minutes in average task completion was seen with asynchronous communication, saving 58.8% of time when compared with traditional synchronous methods. In subcategory analysis for the doctor, nurse and midwifery staff categories, p values of <0.0495 and <0.01 were observed; a mean time reduction with statistical significance was also seen in specific task efficiencies of ‘To-Take-Out (TTO), patient review, discharge & patient transfer and escalation of care & procedure’. The platform was favoured with an average Likert value of 8.7; 67% found it easy to implement. Conclusion: The asynchronous platform improved clinical communication compared with synchronous methods, contributing to efficiencies in workflow, and may positively affect patient care.
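The headline result above is a difference in mean task-completion time between synchronous and asynchronous communication. The study's raw data are not reproduced here; the sketch below only illustrates, on simulated timings, how such a two-group comparison is typically run with a Welch t-test in SciPy:

```python
# Illustrative two-sample comparison of task-completion times (minutes).
# Simulated numbers only -- not the study's data.
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
pager_times = rng.normal(loc=34.0, scale=12.0, size=120)      # synchronous (pager)
messaging_times = rng.normal(loc=14.0, scale=8.0, size=120)   # asynchronous platform

# Welch's t-test (does not assume equal variances).
t_stat, p_value = stats.ttest_ind(pager_times, messaging_times, equal_var=False)
reduction = pager_times.mean() - messaging_times.mean()

print(f"mean reduction: {reduction:.1f} min, t = {t_stat:.2f}, p = {p_value:.3g}")
```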
6

Olakotan, Olufisayo Olusegun, and Maryati Mohd Yusof. "The appropriateness of clinical decision support systems alerts in supporting clinical workflows: A systematic review". Health Informatics Journal 27, no. 2 (April 2021): 146045822110075. http://dx.doi.org/10.1177/14604582211007536.

Abstract:
A CDSS generates a high number of inappropriate alerts that interrupt the clinical workflow. As a result, clinicians silence, disable, or ignore alerts, thereby undermining patient safety. Therefore, the effectiveness and appropriateness of CDSS alerts need to be evaluated. A systematic review was carried out to identify the factors that affect CDSS alert appropriateness in supporting clinical workflow. Seven electronic databases (PubMed, Scopus, ACM, Science Direct, IEEE, Ovid Medline, and Ebscohost) were searched for English language articles published between 1997 and 2018. Seventy six papers met the inclusion criteria, of which 26, 24, 15, and 11 papers are retrospective cohort, qualitative, quantitative, and mixed-method studies, respectively. The review highlights various factors influencing the appropriateness and efficiencies of CDSS alerts. These factors are categorized into technology, human, organization, and process aspects using a combination of approaches, including socio-technical framework, five rights of CDSS, and Lean. Most CDSS alerts were not properly designed based on human factor methods and principles, explaining high alert overrides in clinical practices. The identified factors and recommendations from the review may offer valuable insights into how CDSS alerts can be designed appropriately to support clinical workflow.
7

Chang, Philip, and Shyam Prabhakaran. "Recent advances in the management of acute ischemic stroke". F1000Research 6 (April 13, 2017): 484. http://dx.doi.org/10.12688/f1000research.9191.1.

Abstract:
In recent years, several landmark trials have transformed acute ischemic stroke care. The most dramatic results from the field of acute endovascular intervention demonstrate unequivocal benefit for a select group of patients with moderate to severe deficits presenting within 7 hours from onset and with occlusions of proximal arteries in the anterior circulation. In addition, technological advances and workflow efficiencies have facilitated more rapid delivery of acute stroke interventions. This review provides an overview of recent advances in the management of acute ischemic stroke.
8

Lu, Xinyan. "Development of an Excel-based laboratory information management system for improving workflow efficiencies in early ADME screening". Bioanalysis 8, no. 2 (January 2016): 99–110. http://dx.doi.org/10.4155/bio.15.232.
9

Coughlin, M. "75 Age Appropriate Care of the Premature and Hospitalized Infant: Nursing Workflow Efficiencies and Quality Clinical Outcomes". Archives of Disease in Childhood 97, Suppl 2 (October 1, 2012): A21. http://dx.doi.org/10.1136/archdischild-2012-302724.0075.
10

Sieja, Amber, Eric Kim, Heather Holmstrom, Stephen Rotholz, Chen Tan Lin, Christine Gonzalez, Cortney Arellano, Sarah Hutchings, Denise Henderson, and Katie Markley. "Multidisciplinary Sprint Program Achieved Specialty-Specific EHR Optimization in 20 Clinics". Applied Clinical Informatics 12, no. 02 (March 2021): 329–39. http://dx.doi.org/10.1055/s-0041-1728699.

Abstract:
Abstract Objective The objective of the study was to highlight and analyze the outcomes of software configuration requests received from Sprint, a comprehensive, clinic-centered electronic health record (EHR) optimization program. Methods A retrospective review of 1,254 Sprint workbook requests identified (1) the responsible EHR team, (2) the clinical efficiency gained from the request, and (3) the EHR intervention conducted. Results Requests were received from 407 clinicians and 538 staff over 31 weeks of Sprint. Sixty-nine percent of the requests were completed during the Sprint. Of all requests, 25% required net new build, 73% required technical investigation and/or solutions, and 2% of the requests were escalated to the vendor. The clinical specialty groups requested a higher percentage of items that earned them clinical review (16 vs. 10%) and documentation (29 vs. 23%) efficiencies compared with their primary care colleagues who requested slightly more order modifications (22 vs. 20%). Clinical efficiencies most commonly associated with workbook requests included documentation (28%), ordering (20%), in basket (17%), and clinical review (15%). Sprint user requests evaluated by ambulatory, hardware, security, and training teams comprised 80% of reported items. Discussion Sprint requests were categorized as clean-up, break-fix, workflow investigation, or new build. On-site collaboration with clinical care teams permitted consensus-building, drove vetting, and iteration of EHR build, and led to goal-driven, usable workflows and EHR products. Conclusion This program evaluation demonstrates the process by which optimization can occur and the products that result when we adhere to optimization principles in health care organizations.
11

Dresselhaus, Angela. "Literature of Acquisitions in Review, 2012–13". Library Resources & Technical Services 60, no. 3 (July 28, 2016): 169. http://dx.doi.org/10.5860/lrts.60n3.169.

Abstract:
The acquisitions literature published in 2012–13 shows a strong focus on nontraditional purchasing models, especially for electronic books (e-books). Patron Driven Acquisition (PDA) is one method that helps librarians cope with the budget constraints that continue to plague libraries. The expense of Big Deals has some libraries seeking more efficient alternatives such as Pay-Per-View or Evidence-Based Selection; however, many libraries are still reliant on the depth of coverage and perceived value of Big Deals. This review will cover these trends along with developments in Electronic Resources Management Systems (ERMSs), workflow efficiencies, and negotiation and licensing techniques.
12

Narayanan, Meyyammai, Xiao Zhou, Shawn J. Janarthanan, Mary Daniel, Maria Olmedo, Colleen Jernigan, Shreyaskumar Patel, and Saroj Vadhan-Raj. "Increase in the patient wait-time and delays in the clinic workflow post-implementation of the electronic health record." Journal of Clinical Oncology 35, no. 8_suppl (March 10, 2017): 194. http://dx.doi.org/10.1200/jco.2017.35.8_suppl.194.

Abstract:
194 Background: Growth in patient (pt) volume and limited clinic capacity can lead to long wait-times and pt/provider dissatisfaction. We have previously shown that the room pooling model, can reduce pt wait-time in the exam room, improve room utilization, and pt/providers satisfaction (ASCO 2016, Abstract 6595). One of the important goals of adopting electronic health records (EHR) is also to increase the clinical efficiencies, productivity and quality of care. The purpose of this study was to evaluate the impact of implementation of EHR on pt wait-time in the exam room and satisfaction in the Sarcoma Center. Methods: The time studies and pt and provider wait-time satisfaction surveys were carried out over 2 weeks prior to (baseline) and 6 months after the implementation of EHR. All times of when pts, mid-level providers, and doctors (MD) entered and exited the exam rooms were collected for a total sample size of 578 pts (300 before, 278 after) seen during the clinic hours and analyzed using JMP and SAS. Results: The proportion of pts seen within 30 minutes (Min) by MDs from the time pts roomed into exam room decreased by about 32% [from 53% (148/280) to 36% (94/259), p = 0.0001] post implementation of EHR. The median time for pts in the exam room waiting for MD increased (p = 0.0001) from 30 min (range: 0-126 min) to 40 min (range: 0-121 min). Although, the pt satisfaction did not significantly change [increase from 8% (23/278) to 12% (31/267) in the number of pts that were not satisfied to little-satisfied, and decrease from 92% (255/278) to 88% (236/267) in pts that were moderately to very-satisfied], the number of times MD had to wait for an open exam room increased from 8% (5/65) to 24% (14/59, p = 0.01). The delays to see MDs were associated with longer time spent with the nurse (from median 4 to 7 min), followed by delays in seeing Mid-level provider (from 11 to 18 min). Conclusions: These findings indicate that in the initial stages of implementation of EHR, the increase in pt wait-time and reduced clinical efficiencies can be related to the learning of and adapting to the new system. Attempts targeted to the areas of delays (such as training and redesigning workflow) may reduce the pt wait-time and improve the clinical efficiency.
13

Freihoefer, Kara, Len Kaiser, Dennis Vonasek, and Sara Bayramzadeh. "Setting the Stage: A Comparative Analysis of an Onstage/Offstage and a Linear Clinic Modules". HERD: Health Environments Research & Design Journal 11, no. 2 (September 27, 2017): 89–103. http://dx.doi.org/10.1177/1937586717729348.

Abstract:
Objective: The purpose of this study was to understand how two different ambulatory design modules—traditional and onstage/offstage—impact operational efficiency, patient throughput, staff collaboration, and patient privacy. Background: Delivery of healthcare is greatly shifting to ambulatory settings because of rapid advancement of medicine and technology, resulting in more day procedures and follow-up care occurring outside of hospitals. It is anticipated that outpatient services will grow roughly 15–23% within the next 10 years (Sg2, 2014). Nonetheless, there is limited research that evaluates how the built environment impacts care delivery and patient outcomes. Method: This is a cross-sectional, comparative study consisting of a mixed-method approach that included shadowing clinic staff and observing and surveying patients. The linear module had shared corridors and publicly exposed workstations, whereas the onstage/offstage module separates patient/visitors from staff with dedicated patient corridors leading to exam rooms (onstage) and enclosed staff work cores (offstage). Roughly 35 hr of clinic staff shadowing and 55 hr of patient observations occurred. A total of 269 questionnaires were completed by patients/visitors. Results: The results demonstrate that the onstage/offstage module significantly improved staff workflow, reduced travel distances, increased communication in private areas, and significantly reduced patient throughput and wait times. However, patients’ perception of privacy did not change among the two modules. Conclusion: Compared to the linear module, this study provides evidence that the onstage/offstage module could have helped to optimize operational efficiencies, staff workflow, and patient throughput.
14

Hockey, Julie Michelle. "Transforming library enquiry services: anywhere, anytime, any device". Library Management 37, no. 3 (March 14, 2016): 125–35. http://dx.doi.org/10.1108/lm-04-2016-0021.

Abstract:
Purpose – The purpose of this paper is to outline how the University of South Australia Library transformed its enquiry services by replacing fixed service desks with a blend of virtual and on demand services. Design/methodology/approach – Outlines the drivers for change, implementation approach and partnerships developed in order to change practices and use technology to deliver proactive services. Findings – The new model enables staffing and workflow efficiencies allowing the service to be delivered sustainably. It is anticipated that it will increase the Library’s visibility and accessibility in the physical and virtual environments and position the Library as an innovator in service delivery. Practical implications – The project involved significant change to traditional practices and challenged long held beliefs about library services. It required library staff to be supported and trained to develop new skills and adapt to new practices. Originality/value – Provides strategies and lessons learnt for other libraries considering similar changes to service delivery.
15

Menon, Rahul, and Christopher Rivett. "Time–motion analysis examining the impact of Medic Bleep, an instant messaging platform, versus the traditional pager: a prospective pilot study". DIGITAL HEALTH 5 (January 2019): 205520761983181. http://dx.doi.org/10.1177/2055207619831812.

Abstract:
Objectives Efficient and accurate communication between healthcare professionals (HCPs) serves as the backbone to safe and efficient care delivery. Traditional pager-based interpersonal communication may contribute to inefficient communication practices and lapses in medical care. Methods This study aimed to examine the impact of Medic Bleep, a National Health Service (NHS) information governance-compliant instant messaging application, in an NHS Hospital Trust. We examined Medic Bleep’s impact on participant time and workflow using time–motion methodology. Cohorts of doctors and nurses using both Medic Bleep and the traditional pager were compared. Secondary endpoints of our study were to assess whether efficient communications could lead to better resource utilisation, patient safety as well as better quality of work life for the end user. Results Assimilation of Medic Bleep corresponded to a reduction in mean task-duration that was statistically significant ( p < 0.05) for To Take Out (TTO) and Patient Review categories. Nurses saved an average of 21 minutes per shift ( p < 0.05), whereas doctors saved an average of 48 minutes ( p < 0.05) per shift. Qualitative analysis suggested that HCPs benefited from better work prioritisation, collaboration and reduced medical errors enabled by an auditable communication workflow. Conclusion Medic Bleep reduced time spent on the tasks requiring interpersonal communication. Efficiencies were seen in Discharge Patient Flow, Patient Review and TTO categories. This improved HCP availability and response times to the benefit of patients. End users revealed that Medic Bleep had a positive effect on quality of work life.
16

Carroll, Noel, and Ita Richardson. "Mapping a Careflow Network to assess the connectedness of Connected Health". Health Informatics Journal 25, no. 1 (April 24, 2017): 106–25. http://dx.doi.org/10.1177/1460458217702943.

Abstract:
Connected Health is an emerging and rapidly developing field which has the potential to transform healthcare service systems by increasing its safety, quality and overall efficiency. From a healthcare perspective, process improvement models have mainly focused on the static workflow viewpoint. The objective of this article is to study and model the dynamic nature of healthcare delivery, allowing us to identify where potential issues exist within the service system and to examine how Connected Health technological solutions may support service efficiencies. We explore the application of social network analysis (SNA) as a modelling technique which captures the dynamic nature of a healthcare service. We demonstrate how it can be used to map the ‘Careflow Network’ and guide Connected Health innovators to examine specific opportunities within the healthcare service. Our results indicate that healthcare technology must be correctly identified and implemented within the Careflow Network to enjoy improvements in service delivery. Oftentimes, prior to making the transformation to Connected Health, researchers use various modelling techniques that fail to identify where Connected Health innovation is best placed in a healthcare service network. Using SNA allows us to develop an understanding of the current operation of healthcare system within which they can effect change. It is important to identify and model the resource exchanges to ensure that the quality and safety of care are enhanced, efficiencies are increased and the overall healthcare service system is improved. We have shown that dynamic models allow us to study the exchange of resources. These are often intertwined within a socio-technical context in an informal manner and not accounted for in static models, yet capture a truer insight on the operations of a Careflow Network.
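The paper's contribution is modelling a "Careflow Network" with social network analysis. The actual network is not published in this abstract, so the following NetworkX sketch uses a hypothetical set of care roles and exchanges to show how a centrality measure can flag candidate bottlenecks:

```python
# Toy 'Careflow Network': nodes are care roles, edges are resource/information exchanges.
# Hypothetical structure for illustration -- not the network analysed in the paper.
import networkx as nx

G = nx.DiGraph()
G.add_weighted_edges_from([
    ("GP", "Triage nurse", 5),
    ("Triage nurse", "Consultant", 3),
    ("Consultant", "Radiology", 2),
    ("Radiology", "Consultant", 2),
    ("Consultant", "Community nurse", 4),
    ("Community nurse", "GP", 1),
])

# Betweenness centrality highlights roles that many care pathways must pass through,
# i.e. candidate bottlenecks where a Connected Health intervention might help.
centrality = nx.betweenness_centrality(G, weight="weight")
for role, score in sorted(centrality.items(), key=lambda kv: -kv[1]):
    print(f"{role:16s} {score:.3f}")
```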
17

Lomotan, Edwin A., Ginny Meadows, Maria Michaels, Jeremy J. Michel, and Kristen Miller. "To Share is Human! Advancing Evidence into Practice through a National Repository of Interoperable Clinical Decision Support". Applied Clinical Informatics 11, no. 01 (January 2020): 112–21. http://dx.doi.org/10.1055/s-0040-1701253.

Abstract:
Abstract Background Healthcare systems devote substantial resources to the development of clinical decision support (CDS) largely independently. The process of translating evidence-based practice into useful and effective CDS may be more efficient and less duplicative if healthcare systems shared knowledge about the translation, including workflow considerations, key assumptions made during the translation process, and technical details. Objective Describe how a national repository of CDS can serve as a public resource for healthcare systems, academic researchers, and informaticists seeking to share and reuse CDS knowledge resources or “artifacts.” Methods In 2016, the Agency for Healthcare Research and Quality (AHRQ) launched CDS Connect as a public, web-based platform for authoring and sharing CDS knowledge artifacts. Researchers evaluated early use and impact of the platform by collecting user experiences of AHRQ-sponsored and community-led dissemination efforts and through quantitative/qualitative analysis of site metrics. Efforts are ongoing to quantify efficiencies gained by healthcare systems that leverage shared, interoperable CDS artifacts rather than developing similar CDS de novo and in isolation. Results Federal agencies, academic institutions, and others have contributed over 50 entries to CDS Connect for sharing and dissemination. Analysis indicates shareable CDS resources reduce team sizes and the number of tasks and time required to design, develop, and deploy CDS. However, the platform needs further optimization to address sociotechnical challenges. Benefits of sharing include inspiring others to undertake similar CDS projects, identifying external collaborators, and improving CDS artifacts as a result of feedback. Organizations are adapting content available through the platform for continued research, innovation, and local implementations. Conclusion CDS Connect has provided a functional platform where CDS developers are actively sharing their work. CDS sharing may lead to improved implementation efficiency through numerous pathways, and further research is ongoing to quantify efficiencies gained.
18

Freidel, Matthew R., and Roger S. Armen. "Mapping major SARS-CoV-2 drug targets and assessment of druggability using computational fragment screening: Identification of an allosteric small-molecule binding site on the Nsp13 helicase". PLOS ONE 16, no. 2 (February 17, 2021): e0246181. http://dx.doi.org/10.1371/journal.pone.0246181.

Abstract:
The 2019 emergence of SARS-CoV-2 has tragically taken an immense toll on human life and had far-reaching impacts on society. There is a need to identify effective antivirals with diverse mechanisms of action in order to accelerate preclinical development. This study focused on five of the most established drug target proteins for direct-acting small molecule antivirals: Nsp5 Main Protease, Nsp12 RNA-dependent RNA polymerase, Nsp13 Helicase, Nsp16 2’-O methyltransferase and the S2 subunit of the Spike protein. A workflow of solvent mapping and free energy calculations was used to identify and characterize favorable small-molecule binding sites for an aromatic pharmacophore (benzene). After identifying the most favorable sites, calculated ligand efficiencies were compared utilizing computational fragment screening. The most favorable sites overall were located on Nsp12 and Nsp16, whereas the most favorable sites for Nsp13 and S2 Spike had comparatively lower ligand efficiencies relative to Nsp12 and Nsp16. Utilizing fragment screening on numerous possible sites on Nsp13 helicase, we identified a favorable allosteric site on the N-terminal zinc binding domain (ZBD) that may be amenable to virtual or biophysical fragment screening efforts. Recent structural studies of the Nsp12:Nsp13 replication-transcription complex experimentally corroborate ligand binding at this site, which is revealed to be a functional Nsp8:Nsp13 protein-protein interaction site in the complex. Detailed structural analysis of Nsp13 ZBD conformations shows the role of induced-fit flexibility in this ligand binding site and identifies which conformational states are associated with efficient ligand binding. We hope that this map of over 200 possible small-molecule binding sites for these drug targets may be of use for ongoing discovery, design, and drug repurposing efforts. This information may be used to prioritize screening efforts or aid in the process of deciphering how a screening hit may bind to a specific target protein.
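The ranking step described above relies on calculated ligand efficiency, conventionally the binding free energy normalised by heavy-atom count (LE = −ΔG / N_heavy). The snippet below applies that textbook definition to invented site names and energies; it is not the authors' workflow:

```python
# Rank hypothetical binding sites by ligand efficiency (LE = -dG / heavy-atom count).
# Site names and energies are invented for illustration.
from dataclasses import dataclass

@dataclass
class Site:
    name: str
    delta_g_kcal: float   # calculated binding free energy of the probe fragment
    heavy_atoms: int      # heavy atoms in the probe fragment (benzene = 6)

sites = [
    Site("Nsp12 site A", -4.8, 6),
    Site("Nsp16 site B", -4.5, 6),
    Site("Nsp13 ZBD allosteric", -3.9, 6),
    Site("S2 Spike groove", -3.1, 6),
]

# Most favourable (largest LE) first.
for site in sorted(sites, key=lambda s: s.delta_g_kcal / s.heavy_atoms):
    ligand_efficiency = -site.delta_g_kcal / site.heavy_atoms
    print(f"{site.name:22s} LE = {ligand_efficiency:.2f} kcal/mol per heavy atom")
```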
19

Bodini, Margherita, Chiara Ronchini, Luciano Giacò, Anna Russo, Giorgio E. M. Melloni, Lucilla Luzi, Domenico Sardella et al. "The hidden genomic landscape of acute myeloid leukemia: subclonal structure revealed by undetected mutations". Blood 125, no. 4 (January 22, 2015): 600–605. http://dx.doi.org/10.1182/blood-2014-05-576157.

Abstract:
Abstract The analyses carried out using 2 different bioinformatics pipelines (SomaticSniper and MuTect) on the same set of genomic data from 133 acute myeloid leukemia (AML) patients, sequenced inside the Cancer Genome Atlas project, gave discrepant results. We subsequently tested these 2 variant-calling pipelines on 20 leukemia samples from our series (19 primary AMLs and 1 secondary AML). By validating many of the predicted somatic variants (variant allele frequencies ranging from 100% to 5%), we observed significantly different calling efficiencies. In particular, despite relatively high specificity, sensitivity was poor in both pipelines resulting in a high rate of false negatives. Our findings raise the possibility that landscapes of AML genomes might be more complex than previously reported and characterized by the presence of hundreds of genes mutated at low variant allele frequency, suggesting that the application of genome sequencing to the clinic requires a careful and critical evaluation. We think that improvements in technology and workflow standardization, through the generation of clear experimental and bioinformatics guidelines, are fundamental to translate the use of next-generation sequencing from research to the clinic and to transform genomic information into better diagnosis and outcomes for the patient.
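The comparison of the two variant-calling pipelines ultimately reduces to sensitivity and specificity against validated somatic variants. A minimal bookkeeping sketch on hypothetical call sets is shown below; it is not the study's pipeline:

```python
# Compare a caller's output against a validated truth set of somatic variants.
# Variant identifiers and counts are hypothetical.
def sensitivity_specificity(called, truth, all_tested):
    """Sensitivity/specificity of a call set against validated variants."""
    tp = len(called & truth)
    fn = len(truth - called)
    fp = len(called - truth)
    tn = len(all_tested - called - truth)
    sensitivity = tp / (tp + fn) if (tp + fn) else float("nan")
    specificity = tn / (tn + fp) if (tn + fp) else float("nan")
    return sensitivity, specificity

all_tested = {f"var{i}" for i in range(200)}                 # every position assessed
truth = {f"var{i}" for i in range(40)}                       # validated somatic variants
caller_a = {f"var{i}" for i in range(25)} | {"var150"}       # hypothetical call set A
caller_b = {f"var{i}" for i in range(10, 45)}                # hypothetical call set B

for name, calls in [("caller A", caller_a), ("caller B", caller_b)]:
    sens, spec = sensitivity_specificity(calls, truth, all_tested)
    print(f"{name}: sensitivity={sens:.2f} specificity={spec:.2f}")
```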
20

Kim, Shaun S. H., Dushmanta Dutta, Chas A. Egan, Juernjakob Dugge, Ramneek Singh, Geoff P. Davis, and Joel M. Rahman. "Custom functionality and integrative approaches for hydrological modelling tools for water resources planning and management". Journal of Hydroinformatics 17, no. 1 (July 11, 2014): 75–89. http://dx.doi.org/10.2166/hydro.2014.125.

Abstract:
This paper outlines the application and usefulness of a software platform that enables hydrologists to develop custom functionality in a new hydrological modelling tool, eWater Source, designed for water resources planning and management. The flexible architecture of the software allows incorporation of third-party components as plug-ins to add new capabilities that are not built in. Plug-ins can be developed to adapt the software to suit the needs of hydrologists with modest software development knowledge. This can result in an improvement in workflow and efficiencies. In addition, modellers can use plug-ins to integrate hydrological process and management models that may not be able to be built in the normal tool. The paper introduces the plug-ins functionality of the modelling tool, its design and applications with three example plug-ins to demonstrate. These are: (1) a data processing plug-in to upscale urban environment models; (2) a management rule plug-in to calculate loss allowances for the Pioneer Valley; and (3) a model plug-in to integrate into a river system model. For planning purposes, the use of plug-ins is thought to be critical for modelling management rules for various jurisdictions since these can vary significantly between jurisdictions and change over time.
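eWater Source plug-ins are built against the tool's own component architecture, which is not reproduced in this abstract. The sketch below illustrates only the general plug-in pattern being described (third-party functions registered with, and invoked by, a host model), using hypothetical rule names:

```python
# Generic plug-in registry pattern, illustrating the idea of extending a host
# modelling tool with user-written components. This is not the eWater Source API.
from typing import Callable, Dict

PLUGINS: Dict[str, Callable[[float], float]] = {}

def register(name: str):
    """Decorator that registers a plug-in function under a name."""
    def wrap(func: Callable[[float], float]) -> Callable[[float], float]:
        PLUGINS[name] = func
        return func
    return wrap

@register("loss_allowance")
def loss_allowance(flow_megalitres: float) -> float:
    # Hypothetical management rule: 4% transmission loss on the ordered flow.
    return 0.04 * flow_megalitres

@register("upscale_urban_demand")
def upscale_urban_demand(demand_megalitres: float) -> float:
    # Hypothetical data-processing step: scale a lumped urban demand by 1.15.
    return 1.15 * demand_megalitres

# The 'host' looks up and runs whichever plug-ins a model configuration names.
for name in ("loss_allowance", "upscale_urban_demand"):
    print(name, PLUGINS[name](100.0))
```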
21

Källsten, Malin, Rafael Hartmann, Lucia Kovac, Fredrik Lehmann, Sara Bergström Lind, and Jonas Bergquist. "Investigating the Impact of Sample Preparation on Mass Spectrometry-Based Drug-To-Antibody Ratio Determination for Cysteine- and Lysine-Linked Antibody–Drug Conjugates". Antibodies 9, no. 3 (September 8, 2020): 46. http://dx.doi.org/10.3390/antib9030046.

Abstract:
Antibody–drug conjugates (ADCs) are heterogeneous biotherapeutics and differ vastly in their physicochemical properties depending on their design. The number of small drug molecules covalently attached to each antibody molecule is commonly referred to as the drug-to-antibody ratio (DAR). Established analytical protocols for mass spectrometry (MS)-investigation of antibodies and ADCs often require sample treatment such as desalting or interchain disulfide bond reduction prior to analysis. Herein, the impact of the desalting and reduction steps—as well as the sample concentration and elapsed time between synthesis and analysis of DAR-values (as acquired by reversed phase liquid chromatography MS (RPLC–MS))—was investigated. It was found that the apparent DAR-values could fluctuate by up to 0.6 DAR units due to changes in the sample preparation workflow. For methods involving disulfide reduction by means of dithiothreitol (DTT), an acidic quench is recommended in order to increase DAR reliability. Furthermore, the addition of a desalting step was shown to benefit the ionization efficiencies in RPLC–MS. Finally, in the case of delayed analyses, samples can be stored at four degrees Celsius for up to one week but are better stored at −20 °C for longer periods of time. In conclusion, the results demonstrate that commonly used sample preparation procedures and storage conditions themselves may impact MS-derived DAR-values, which should be taken into account when evaluating analytical procedures.
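The drug-to-antibody ratio obtained from RPLC–MS data is conventionally an intensity-weighted average over the deconvoluted drug-load species, DAR = Σ k·Ik / Σ Ik. The snippet below shows that arithmetic on invented peak intensities; it is not the authors' processing script:

```python
# Intensity-weighted average DAR from deconvoluted drug-load species.
# Peak intensities are invented for illustration.
def weighted_dar(peak_intensities):
    """peak_intensities maps drug load k (0, 1, 2, ...) to its relative peak intensity."""
    total = sum(peak_intensities.values())
    return sum(k * intensity for k, intensity in peak_intensities.items()) / total

# Example: an ADC with drug loads 0-4 observed in the deconvoluted spectrum.
peaks = {0: 5.0, 1: 20.0, 2: 40.0, 3: 25.0, 4: 10.0}
print(f"DAR = {weighted_dar(peaks):.2f}")   # 2.15 for these numbers
```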
22

Cheng, Daryl R., and Mike South. "Electronic Task Management System: A Pediatric Institution's Experience". Applied Clinical Informatics 11, no. 05 (October 2020): 839–45. http://dx.doi.org/10.1055/s-0040-1721321.

Abstract:
Abstract Background Electronic medical task management systems (ETMs) have been adopted in health care institutions to improve health care provider communication. ETMs allow for the requesting and resolution of nonurgent tasks between clinicians of all craft groups. Visibility, ability to provide close-loop feedback, and a digital trail of all decisions and responsible clinicians are key features of ETMs. An embedded ETM within an integrated electronic health record (EHR) was introduced to the Royal Children's Hospital Melbourne on April 30, 2016. The ETM is used hospital-wide for nonurgent tasks 24 hours a day. It facilitates communication of nonurgent tasks between clinical staff, with an associated designated timeframe in which the task needs to be completed (2, 4, and 8 hours). Objective This study aims to examine the usage of the ETM at our institution since its inception. Methods ETM usage data from the first 3 years of use (April 2016 to April 2019) were extracted from the EHR. Data collected included age of patient, date and time of task request, ward, unit, type of task, urgency of task, requestor role, and time to completion. Results A total of 136,481 tasks were placed via the ETM in the study period. There were approximately 125 tasks placed each day (24-hour period). The most common time of task placement was around 6:00 p.m. Task placement peaked at approximately 8 a.m., 2 p.m., and 9 p.m.—consistent with nursing shift change times. In total, 63.16% of tasks were placed outside business hours, indicating predominant usage for after-hours task communication. The ETM was most highly utilized by surgical units. The majority of tasks were ordered by nurses for medical staff to complete (97.01%). A significant proportion (98.79%) of tasks was marked as complete on the ETM, indicating closed-loop feedback after tasks were requested. Conclusion An ETM function embedded in our EHR has been highly utilized in our institution since its introduction. It has multiple benefits for the clinician in the form of efficiencies in workflow and improvement in communication and also workflow management. By allowing collection, tracking, audit, and prioritization of tasks, it also provides a stream of actionable data for quality-improvement activities.
23

Lo Sciuto, Grazia. "Application of Artificial Intelligence for Optimization of Organic Solar Cells Production Process". Photonics Letters of Poland 12, no. 2 (July 1, 2020): 34. http://dx.doi.org/10.4302/plp.v12i2.993.

Abstract:
The study of organic solar cells (OSCs) has developed rapidly in recent years. Organic solar cell technology is sought after mainly due to ease of manufacture and exclusive properties such as mechanical flexibility, light weight, and transparency. These properties make OSCs well suited for unconventional applications, with power conversion efficiencies higher than 10%. The flexibility of the substrates used and the thinness of the devices make OSCs ideal for roll-to-roll production. However, organic solar cells still have very low conversion efficiencies owing to degradation and stability limitations of the technology, so OSCs have to be optimized to extract their full potential. The production chain of organic solar cells can also take advantage of artificial intelligence (AI): integrating AI into the production workflow makes solar cells more competitive and efficient. This paper presents some applications of AI for the optimization of OSC production processes.
24

Daniotti, Bruno, Cecilia Maria Bolognesi, Sonia Lupica Spagnolo, Alberto Pavan, Martina Signorini, Simone Ciuffreda, Claudio Mirarchi et al. "An Interoperable BIM-Based Toolkit for Efficient Renovation in Buildings". Buildings 11, no. 7 (June 25, 2021): 271. http://dx.doi.org/10.3390/buildings11070271.

Abstract:
Since the buildings and construction sector is one of the main areas responsible for energy consumption and emissions, focusing on their refurbishment and promoting actions in this direction will be helpful to achieve an EU Agenda objective of making Europe climate-neutral by 2050. One step towards the renovation action is the exploitation of digital tools into a BIM framework. The scope of the research contained in this paper is to improve the management of information throughout the different stages of the renovation process, allowing an interoperable exchange of data among the involved stakeholders; the development of an innovative BIM-based toolkit is the answer to the research question. The research and results obtained relate to the development of an interoperable BIM-based toolkit for efficient renovation in buildings in the framework of the European research project BIM4EEB. Specifically, the developed BIM management system allows the exchange of the data among the different tools, using open interoperable formats (as IFC) and linked data, in a Common Data Environment, to be used by the different stakeholders. Additionally, the developed tools allow the stakeholders to manage different stages of the renovation process, facilitating efficiencies in terms of time reduction and improving the resulting quality. The validity of each tool with respect to existing practices is demonstrated here, and the strengths and weaknesses of the proposed tools are described in the workflow, detailing issues such as interoperability, collaboration, integration of different solutions, and time-consuming existing survey processes.
25

Calder-Sprackman, S., G. Clapham, T. Kandiah, J. Choo-Foo, S. Aggarwal, J. Sweet, K. Abdulkarim, C. Price, V. Thiruganasambandamoorthy, and E. Kwok. "MP02: The impact of adoption of an electronic health record on emergency physician work: a time motion study". CJEM 22, S1 (May 2020): S42–S43. http://dx.doi.org/10.1017/cem.2020.150.

Abstract:
Introduction: Adoption of a new Electronic Health Record (EHR) can introduce radical changes in task allocation, work processes, and efficiency for providers. In June 2019, The Ottawa Hospital transitioned from a primarily paper based EHR to a comprehensive EHR (Epic) using a “big bang” approach. The objective of this study was to assess the impact of the transition to Epic on Emergency Physician (EP) work activities in a tertiary care academic Emergency Department (ED). Methods: We conducted a time motion study of EPs on shift in low acuity areas of our ED (CTAS 3-5). Fifteen EPs representing a spectrum of pre-Epic baseline workflow efficiencies were directly observed in real-time during two 4-hour sessions prior to EHR implementation (May 2019) and again in go live (August 2019). Trained observers performed continuous observation and measured times for the following EP tasks: chart review, direct patient care, documentation, physical movement, communication, teaching, handover, and other (including breaks). We compared time spent on tasks pre Epic and during go live and report mean times for the EP tasks per patient and per shift using two tailed t-test for comparison. Results: All physicians had a 17% decrease in patients seen after Epic implementation (2.72/hr vs 2.24/hr, p < 0.01). EPs spent the same amount of time per patient on direct patient care and chart review (direct patient care: 9min06sec/pt pre vs 8min56sec/pt go live, p = 0.77; chart review: 2min47sec/pt pre vs 2min50sec/pt go live, p = 0.88), however, documentation time increased (5min28sec/pt pre vs 7min12sec/pt go live, p < 0.01). Time spent on shift teaching learners increased but did not reach statistical significance (31min26sec/shift pre vs 36min21sec/shift go live, p = 0.39), and time spent on non-patient-specific activities – physical movement, handover, team communication, and other – did not change (50min49sec/shift pre vs 50min53sec/shift go live, p = 0.99). Conclusion: Implementation of Epic did not affect EP time with individual patients - there was no change in direct patient care or chart review. Documentation time increased and EP efficiency (patients seen per hr on shift) decreased after go live. Patient volumes cannot be adjusted in the ED therefore anticipating the EHR impact on EP workflow is critical for successful implementation. EDs may consider up staffing 20% during go live. Findings from this study can inform how to best support EDs nationally through transition to EHR.
26

Conway, Nicholas, Karen A. Adamson, Scott G. Cunningham, Alistair Emslie Smith, Peter Nyberg, Blair H. Smith, Ann Wales, and Deborah J. Wake. "Decision Support for Diabetes in Scotland: Implementation and Evaluation of a Clinical Decision Support System". Journal of Diabetes Science and Technology 12, no. 2 (September 14, 2017): 381–88. http://dx.doi.org/10.1177/1932296817729489.

Abstract:
Background: Automated clinical decision support systems (CDSS) are associated with improvements in health care delivery to those with long-term conditions, including diabetes. A CDSS was introduced to two Scottish regions (combined diabetes population ~30 000) via a national diabetes electronic health record. This study aims to describe users’ reactions to the CDSS and to quantify impact on clinical processes and outcomes over two improvement cycles: December 2013 to February 2014 and August 2014 to November 2014. Methods: Feedback was sought via patient questionnaires, health care professional (HCP) focus groups, and questionnaires. Multivariable regression was used to analyze HCP SCI-Diabetes usage (with respect to CDSS message presence/absence) and case-control comparison of clinical processes/outcomes. Cases were patients whose HCP received a CDSS messages during the study period. Closely matched controls were selected from regions outside the study, following similar clinical practice (without CDSS). Clinical process measures were screening rates for diabetes-related complications. Clinical outcomes included HbA1c at 1 year. Results: The CDSS had no adverse impact on consultations. HCPs were generally positive toward CDSS and used it within normal clinical workflow. CDSS messages were generated for 5692 cases, matched to 10 667 controls. Following clinic, the probability of patients being appropriately screened for complications more than doubled for most measures. Mean HbA1c improved in cases and controls but more so in cases (–2.3 mmol/mol [–0.2%] versus –1.1 [–0.1%], P = .003). Discussion and Conclusions: The CDSS was well received; associated with improved efficiencies in working practices; and large improvements in guideline adherence. These evidence-based, early interventions can significantly reduce costly and devastating complications.
27

Lu, Shuning, Shicun Huang, Zhiqiang Pan, Huawu Deng, David Stanley, and Yubin Xin. "HIGH PERFORMANCE COMPUTING FOR DSM EXTRACTION FROM ZY-3 TRI-STEREO IMAGERY". ISPRS Annals of Photogrammetry, Remote Sensing and Spatial Information Sciences III-1 (June 2, 2016): 113–20. http://dx.doi.org/10.5194/isprsannals-iii-1-113-2016.

Abstract:
ZY-3 has been acquiring high quality imagery since its launch in 2012 and its tri-stereo (three-view or three-line-array) imagery has become one of the top choices for extracting DSM (Digital Surface Model) products in China over the past few years. The ZY-3 tri-stereo sensors offer users the ability to capture imagery over large regions including an entire territory of a country, such as China, resulting in a large volume of ZY-3 tri-stereo scenes which require timely (e.g., near real time) processing, something that is not currently possible using traditional photogrammetry workstations. This paper presents a high performance computing solution which can efficiently and automatically extract DSM products from ZY-3 tri-stereo imagery. The high performance computing solution leverages certain parallel computing technologies to accelerate computation within an individual scene and then deploys a distributed computing technology to increase the overall data throughput in a robust and efficient manner. By taking advantage of the inherent efficiencies within the high performance computing environment, the DSM extraction process can exploit all combinations offered from a set of tri-stereo images (forward-backward, forward-nadir and backward-nadir). The DSM results merged from all of the potential combinations can minimize blunders (e.g., incorrect matches) and also offer the ability to remove potential occlusions which may exist in a single stereo pair, resulting in improved accuracy and quality versus those that are not merged. Accelerated performance is inherent within each of the individual steps of the DSM extraction workflow, including the collection of ground control points and tie points, image bundle adjustment, the creation of epipolar images, and computing elevations. Preliminary experiments over a large area in China have proven that the high performance computing system can generate high quality and accurate DSM products in a rapid manner.
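The speed-up described combines parallelism within each scene with distribution of many scenes across workers. The authors' system is not open source here, so the sketch below only illustrates the outer, scene-level distribution with Python's standard process pool and a stand-in for the per-scene DSM extraction step:

```python
# Illustrative scene-level parallelism for a DSM extraction workflow.
# 'extract_dsm' is a stand-in for the real per-scene processing chain
# (GCP/tie-point collection, bundle adjustment, epipolar resampling, matching).
from concurrent.futures import ProcessPoolExecutor

def extract_dsm(scene_id: str) -> str:
    # Placeholder for the compute-heavy per-scene work.
    return f"{scene_id}: DSM written"

def process_scenes(scene_ids, max_workers=4):
    with ProcessPoolExecutor(max_workers=max_workers) as pool:
        return list(pool.map(extract_dsm, scene_ids))

if __name__ == "__main__":
    scenes = [f"ZY3_tri_stereo_{i:04d}" for i in range(8)]
    for line in process_scenes(scenes):
        print(line)
```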
28

Dreier, Matthias, Hélène Berthoud, Noam Shani, Daniel Wechsler, and Pilar Junier. "SpeciesPrimer: a bioinformatics pipeline dedicated to the design of qPCR primers for the quantification of bacterial species". PeerJ 8 (February 18, 2020): e8544. http://dx.doi.org/10.7717/peerj.8544.

Abstract:
Background Quantitative real-time PCR (qPCR) is a well-established method for detecting and quantifying bacteria, and it is progressively replacing culture-based diagnostic methods in food microbiology. High-throughput qPCR using microfluidics brings further advantages by providing faster results, decreasing the costs per sample and reducing errors due to automatic distribution of samples and reagents. In order to develop a high-throughput qPCR approach for the rapid and cost-efficient quantification of microbial species in complex systems such as fermented foods (for instance, cheese), the preliminary setup of qPCR assays working efficiently under identical PCR conditions is required. Identification of target-specific nucleotide sequences and design of specific primers are the most challenging steps in this process. To date, most available tools for primer design require either laborious manual manipulation or high-performance computing systems. Results We developed the SpeciesPrimer pipeline for automated high-throughput screening of species-specific target regions and the design of dedicated primers. Using SpeciesPrimer, specific primers were designed for four bacterial species of importance in cheese quality control, namely Enterococcus faecium, Enterococcus faecalis, Pediococcus acidilactici and Pediococcus pentosaceus. Selected primers were first evaluated in silico and subsequently in vitro using DNA from pure cultures of a variety of strains found in dairy products. Specific qPCR assays were developed and validated, satisfying the criteria of inclusivity, exclusivity and amplification efficiencies. Conclusion In this work, we present the SpeciesPrimer pipeline, a tool to design species-specific primers for the detection and quantification of bacterial species. We use SpeciesPrimer to design qPCR assays for four bacterial species and describe a workflow to evaluate the designed primers. SpeciesPrimer facilitates efficient primer design for species-specific quantification, paving the way for a fast and accurate quantitative investigation of microbial communities.
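Primer validation in this pipeline hinges on amplification efficiency, conventionally derived from the slope of a standard curve (Cq versus log10 template amount) as E = 10^(−1/slope) − 1. The sketch below applies that textbook formula to an invented dilution series; it is not part of SpeciesPrimer:

```python
# Amplification efficiency from a qPCR standard curve: E = 10**(-1/slope) - 1.
# Dilution-series data are invented for illustration.
import numpy as np

log10_copies = np.array([6, 5, 4, 3, 2], dtype=float)        # 10-fold dilution series
cq_values = np.array([15.1, 18.5, 21.9, 25.3, 28.7])          # measured Cq per dilution

slope, intercept = np.polyfit(log10_copies, cq_values, deg=1)
efficiency = 10 ** (-1.0 / slope) - 1.0

print(f"slope = {slope:.2f}, efficiency = {efficiency:.1%}")   # slope of -3.4 gives ~97%
```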
30

Mishra, Ashish, and J. Mark Tuthill. "Implementation of Whole-Slide Imaging as a Pathology Teaching Tool and for Institutional Tumor Boards: A Resident’s Experience". American Journal of Clinical Pathology 152, Supplement_1 (September 11, 2019): S123. http://dx.doi.org/10.1093/ajcp/aqz123.002.

Abstract:
Objectives: This presentation will describe our experience implementing and utilizing whole-slide imaging (WSI) as a teaching tool for the pathology residents in Henry Ford Hospital, Detroit, as well as our initial efforts to use WSI at institutional tumor boards. Methods: Glass slides were scanned for practice over several weeks to determine basic operation, system performance, and workflow processes. Experience quickly showed that the scanner could be used to improve the quality and efficiency of the weekly unknown slide conference. Informatics lecture and luncheon meeting topics as well as a grand-rounds presentation on novel ways to use WSI were shared with residents and other members of the department. This resulted in markedly increased interest. Soon interest grew from attending physicians to use WSI for a subset of tumor boards. The same processes and procedures used for scanning slides for unknown conference were applied. Results: In October 2016, an unknown slide conference was presented using WSI. The reaction to the quality of the histopathology system usage was excellent: nuclear contours and nucleoli were clear; navigation was easy; response time was excellent, with no screen lag. Residents and attendings loved the new format. Since then, the unknown conference has been presented monthly using WSI. In November 2016, we started presenting cases on WSI in the GYN tumor board. Some had no idea that this was even technically possible. All GYN weekly tumor boards are now presented using WSI. Conclusion: Whole-slide imaging is a useful tool for teaching and presentation purposes. It can be easily implemented and integrated into our day-to-day pathology practice and resident training. The reluctance to use WSI is initially high among pathologists, but enthusiasm increases once implemented into regular practice. WSI provides for efficiencies and ease of collaboration in both educational and clinical case review settings such as institutional tumor boards.
31

Kurwi, Sahar, Peter Demian, Karen B. Blay, and Tarek M. Hassan. "Collaboration through Integrated BIM and GIS for the Design Process in Rail Projects: Formalising the Requirements". Infrastructures 6, no. 4 (March 30, 2021): 52. http://dx.doi.org/10.3390/infrastructures6040052.

Abstract:
Many of the obstacles to effective delivery of rail projects (in terms of cost, time and quality) can be traced back to poor collaboration across complex design teams and supply chains. As in any infrastructure delivery process, it is important to make decisions collaboratively at an early design stage. Advanced systems such as Building Information Modelling (BIM) and Geographic Information Systems (GIS) can facilitate collaboration during the decision-making process and boost work efficiencies. Such potential benefits are not realised because the roles of BIM and GIS in facilitating collaboration are not clearly understood or articulated. This paper aims to identify and articulate collaboration requirements during the design stage of rail projects. To achieve this, a mixed-method approach was employed to examine the issues that hinder collaboration in rail projects. An online questionnaire was designed to assess the state-of-art in BIM and GIS, followed by fifteen follow-up face to face interviews with experts to identify collaboration issues and suggestions to overcome them. The research identified the main challenges to effective collaboration and provided suggestions to overcome them. The main challenges were managing information and a reluctance to use new collaboration technologies. The main solution which emerged from the data was to develop an original Collaborative Plan of Work (CPW). The developed CPW is tailored to rail projects and has been formulated by combining the RIBA (Royal Institute of British Architects) Plan of Work and the GRIP Stages (Governance for Railway Investment Projects). This comprehensive plan of work, which is uniquely collaboration-focused, is significant because it can be further developed to formulate a precise process model for collaboration during the design process of rail projects. Such a process can (for example) be configured into the workflow prescribed by a Common Data Environment.
32

Roberson, Janie, Allison Wrenn, John Poole, Andrew Jaeger e Isam A. Eltoum. "Constructing a modern cytology laboratory: A toolkit for planning and design". CytoJournal 10 (28 de fevereiro de 2013): 3. http://dx.doi.org/10.4103/1742-6413.107983.

Abstract:
Introduction: Constructing or renovating a laboratory can be both challenging and rewarding. UAB Cytology (UAB CY) recently undertook a project to relocate from a building constructed in 1928 to new space. UAB CY is part of an academic center that provides service to a large patient population, supports one cytotechnology program and one cytopathology fellowship training program, and is actively involved in research and scholarly activity. Our objectives were to provide a safe, aesthetically pleasing space and to gain efficiencies through lean processes. Methods: The phases of any laboratory design project are Planning, Schematic Design (SD), Design Development (DD), Construction Documents (CD) and Construction. Lab personnel are most critical in the Planning phase. During this time, stakeholders, relationships, budget, square footage and equipment were identified. Equipment lists, including what would be relocated, purchased new, and projected for future growth, ensured that utilities were matched to expected need. A chemical inventory was prepared and adequate storage space was planned. Regulatory and safety requirements were discussed. Tours and high-level process flow diagrams helped architects and engineers understand the laboratory's daily work. Future needs were addressed through a questionnaire which identified potential areas of growth and technological change. Throughout the project, decisions were driven by data from the planning phase. During the SD phase, objective information from the first phase was used by architects and planners to create a general floor plan. This was the basis of a series of meetings to brainstorm and suggest modifications. DD brought more detail to the plans, with engineering, casework, equipment specifics and finishes. Design changes should be completed at this phase. The next phase, CD, took the project from the lab's purview into a purely technical mode. Construction documents were used by the contractor for the bidding process and ultimately the Construction phase. Results: The project fitted out a total of 9,000 square feet: 4,000 laboratory and 5,000 office/support. Lab space includes areas for Prep, CT screening, sign-out and Imaging. Adjacent space houses faculty offices and conferencing facilities. Transportation time was reduced (waste removal) by a pneumatic tube system, a specimen drop window to the Prep Lab and a pass-through window to the screening area. Open screening and prep areas allow visual management control. Efficiencies were gained by ergonomically placing CT Manual and Imaging microscopes and computers in close proximity, also facilitating a paperless workflow for additional savings. Logistically, closer proximity to Surgical Pathology maximized the natural synergies between the areas. Conclusions: Lab construction should be a systematic process based on sound principles for safety, high-quality testing, and finance. Our detailed planning and design process can be a model for others undertaking similar projects.
33

Davidson, John K. "Plate tectonic structural geology to detailed field and prospect stress prediction". APPEA Journal 48, n.º 1 (2008): 153. http://dx.doi.org/10.1071/aj07010.

Abstract:
Arguably the first successful application of the theory of continental drift to petroleum exploration was in 1959 by the pioneers S. W. Carey and L. G. Weeks, whose collaboration led to the discovery of the world-class Gippsland Basin. Plate tectonics, as the theory is now known, was still nascent and not prominent during peak global oil exploration success in the 1960s. As discovery rates continue to decline, large-scale description of separating and colliding continents has become increasingly impotent in the ever more complex hunt for the next barrel. Emphasis is turning from new basins and plays to smaller intra-basin discoveries related to a more detailed understanding of basin-forming faults and their local stress effects on traps and trap geometries. Improved oil recovery is not only about finding new fields; it also demands detailed stress information for horizontal wellbore stability to economically and effectively increase reserves and recovery rates by extracting new oil from old fields. As a result, expensive wellbore-based measurements have been deployed in the past 15 years. These precision measurements have then been averaged between wells for stress prediction, but stress directions are known to vary abruptly by up to 90° over distances of less than 3 km. A solution lies in the seismic recognition of globally synchronous compressional pulses which, like a heartbeat, add predictability to stress fields and hence to stress analysis. This repetition of stress provides a workflow for stress-consistent seismic interpretation that can predict horizontal and vertical changes in both the direction (SHD) and the magnitude (SHM) of the maximum horizontal compressional stress, SH. It is now possible to derive, pre-drill and at any desired point, important exploration and production variables such as stress-related fault seal and open-fracture orientation. Similarly, important reservoir development parameters such as fracture gradients and wellbore stability prediction will maximise recovery efficiencies and reduce development costs. This technique will also aid in effective carbon dioxide sequestration, a challenging new field of endeavour.
34

Vadhan-Raj, Saroj, Xiao Zhou, Meyyammai Narayanan, Shawn J. Janarthanan, Mary Daniel, Colleen Jernigan e Shreyaskumar Patel. "Impact of room pooling and electronic health record on patient (pt) wait time, clinic work flow, and pts’/providers’ satisfaction." Journal of Clinical Oncology 35, n.º 15_suppl (20 de maio de 2017): e18191-e18191. http://dx.doi.org/10.1200/jco.2017.35.15_suppl.e18191.

Abstract:
e18191 Background: Excessive pt wait time can have negative effect on clinic work flow and on pts/ providers satisfaction. Increasing pt volume and limited clinic capacity can lead to long wait times for pts. The purpose of this two-part study was to evaluate the impact of Room Pooling Model (RPM) instead of Room Allocation Model (Part 1) and Electronic Health Record (EHR) on pt wait times in clinic and pts’/providers’ satisfaction (Part 2). Methods: The time studies and pts’/providers’ wait time satisfaction surveys were carried out over 2 weeks before (baseline) and 8 weeks after the implementation of RPM (Part 1), prior to the new EHR system, and 6 months after the implementation of EHR (part 2). All times of when pts, mid-level providers (MLP), and doctors (MDs) entered and exited the exam rooms were collected for 887 pts seen during the clinic. Data was analyzed using JMP and SAS. Results: As described earlier (ASCO 2016, abst 6595), the RPM was associated with increase in the proportion of pts seen by MDs within 30 min from the time roomed in the exam room and improvement in pts’/provider’s satisfaction. Post EHR, there were delays with decrease in the proportion of pts seen within 30 min from the time roomed in. Although the pt satisfaction did not change significantly, the number of times MDs had to wait for an open exam room increased from 8% (5/65) to 24% (14/59, p=0.01). The impact of RPM and EHR on pt times are shown below. The delays to see MDs after EHR were associated with longer time spent with the nurse (from median 4 to 7 min) and delays in seeing MLPs (from 11 to 18 min). Conclusions: These findings indicate that RPM can improve pt wait times. During initial stages of EHR implementation, the increase in pt wait time and reduced clinical efficiencies can be related to learning, and adapting to the new system. These data can be useful to design interventions that can target the areas of delays such as training and redesigning workflow to improve the clinical efficiency. [Table: see text]
35

Novak Lauscher, H., E. Stacy, J. Christenson, B. Clifford, F. Flood, D. Horvat, R. Markham, J. Pawlovich, P. Rowe e K. Ho. "MP34: Evaluation of real-time virtual support for rural emergency care". CJEM 20, S1 (maio de 2018): S53. http://dx.doi.org/10.1017/cem.2018.188.

Abstract:
Introduction: In many rural and remote communities in BC, family physicians who are providing excellent primary and emergency care would like to access useful, timely, and collegial support to ensure the highest quality of health services for their patients. We undertook a real-time virtual support project in Robson Valley, located in northern BC, to evaluate the use of digital technologies such as videoconferencing for on demand consultation between family physicians at rural sites and emergency physicians at a regional site. Telehealth consults also occurred between rural sites with nurses at community emergency rooms consulting with local on-call physicians. Our aim was to use telehealth to facilitate timely access to high quality, comprehensive, coordinated team-based care. An evaluation framework, based on the Triple Aim sought to: 1) Identify telehealth use cases and assess impact on patient outcomes, patient and health professional experience, and cost of health care delivery; and 2) Assess the role of relationships among care team members in progressing from uptake to normalization of telehealth into routine usage. Methods: Using a participatory approach, all members of the pilot project were involved in shaping the pilot including the co-development of the evaluation itself. Evaluation was used iteratively throughout implementation for ongoing quality improvement via regular team meetings, sharing and reflecting on findings, and adjusting processes as required. Mixed methods were used including: interviews with family physicians, nurses, and patients at rural sites, and emergency physicians at regional site; review of records such as technology use statistics; and stakeholder focus groups. Results: From November 2016 to July 2017, 26 cases of telehealth use were captured and evaluated. Findings indicate that telehealth has positively impacted care team, patients, and health system. Benefits for care team at the rural sites included confidence in diagnoses through timely access to advice and support, while emergency physicians at the regional site gained deeper understanding of the practice settings of rural colleagues. Nevertheless, telehealth has complicated the emergency department work flow and increased physician workload. Findings demonstrated efficiencies for the health system, including reducing the need for patient transfer. Patients expressed confidence in the physicians and telehealth system; by receiving care closer to home, they experienced personal cost savings. Implementation saw a move away from scheduled telehealth visits to real use of technology for timely access. Conclusion: Evidence of the benefits of telehealth in emergency settings is needed to support stakeholder engagement to address issues of workflow and capacity. This pilot has early indications of significant local impact and will inform the expansion of emergency telehealth in all emergency settings in BC.
36

Byers, Carl, e Andrew Woo. "3D data visualization: The advantages of volume graphics and big data to support geologic interpretation". Interpretation 3, n.º 3 (1 de agosto de 2015): SX29—SX39. http://dx.doi.org/10.1190/int-2014-0257.1.

Abstract:
The ability to integrate diverse data types from multiple live and simulated sources, manipulate them dynamically, and deploy them in integrated, visual formats and in mobile settings provides significant advantages. We have reviewed some of the benefits of volume graphics and the use of big data in the context of 3D visualization case studies, in which inherent features, such as representation efficiencies, dynamic modifications, cross sectioning, and others, could improve interpretation processes and workflows.
37

Rozario, Andrea M., e Chanta Thomas. "Reengineering the Audit with Blockchain and Smart Contracts". Journal of Emerging Technologies in Accounting 16, n.º 1 (1 de março de 2019): 21–35. http://dx.doi.org/10.2308/jeta-52432.

Abstract:
ABSTRACT Blockchain and smart contracts are evolving business practices by enhancing efficiencies and transparency in the value chain. The fusion of these innovations is also likely to transform auditing by automating workflows but more importantly, by enhancing audit effectiveness and reporting. This paper envisions the future financial statement audit by proposing an external audit blockchain that supports smart audit procedures. The external audit blockchain has the potential to improve audit quality and narrow the expectation gap between auditors, financial statement users, and regulatory bodies.
38

Mohammed, Heba Tallah, Lirije Hyseni, Victoria Bui, Beth Gerritsen, Katherine Fuller, Jihyun Sung e Mohamed Alarakhia. "Exploring the use and challenges of implementing virtual visits during COVID-19 in primary care and lessons for sustained use". PLOS ONE 16, n.º 6 (24 de junho de 2021): e0253665. http://dx.doi.org/10.1371/journal.pone.0253665.

Abstract:
Background The COVID-19 pandemic has rapidly transformed how healthcare is delivered to limit the transmission of the virus. This descriptive cross-sectional study explored the current use of virtual visits in providing care among primary care providers in southwestern Ontario during the first wave of the COVID-19 pandemic and the anticipated level of utilization post-pandemic. It also explored clinicians’ perceptions of the available support tools and resources and challenges to incorporating virtual visits within primary care practices. Methods Primary care physicians and nurse practitioners currently practicing in the southwestern part of Ontario were invited to participate in an online survey. The survey invite was distributed via email, different social media platforms, and newsletters. The survey questions gathered clinicians’ demographic information and assessed their experience with virtual visits, including the proportion of visits conducted virtually (before, during the pandemic, and expected volume post-pandemic), overall satisfaction and comfort level with offering virtual visits using modalities, challenges experienced, as well as useful resources and tools to support them in using virtual visits in their practice. Results We received 207 responses, with 96.6% of respondents offering virtual visits in their practice. Participants used different modalities to conduct virtual visits, with the vast majority offering visits via phone calls (99.5%). Since the COVID-19 pandemic, clinicians who offered virtual visits have conducted an average of 66.4% of their visits virtually, compared to an average of 6.5% pre-pandemic. Participants anticipated continuing use of virtual visits with an average of 43.9% post-pandemic. Overall, 74.5% of participants were satisfied with their experience using virtual visits, and 88% believed they could incorporate virtual visits well within the usual workflow. Participants highlighted some challenges in offering virtual care. For example, 58% were concerned about patients’ limited access to technology, 55% about patients’ knowledge of technology, and 41% about the lack of integration with their current EMR, the increase in demand over time, and the connectivity issues such as inconsistent Wi-Fi/Internet connection. There were significant differences in perception of some challenges between clinicians in urban vs, rural areas. Clinicians in rural areas were more likely to consider the inconsistent Wi-Fi and limited connectivity as barriers to incorporating virtual visits within the practice setting (58.8% vs. 40.2%, P = 0.030). In comparison, clinicians in urban areas were significantly more concerned about patients overusing virtual care services (39.4% vs. 21.6%, P = 0.024). As for support tools, 47% of clinicians advocated for virtual care standards outlined by their profession’s college. About 32% identified change management support and technical training as supportive tools. Moreover, 39% and 28% thought local colleagues and in-house organizational support are helpful resources, respectively. Conclusion Our study shows that the adoption of virtual visits has exponentially increased during the pandemic, with a significant interest in continuing to use virtual care options in the delivery of primary care post-pandemic. 
The study sheds light on tools and resources that could enhance operational efficiencies in adopting virtual visits in primary care settings, and highlights challenges that, when addressed, can expand health system capacity and sustain the use of virtual care.
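
A minimal illustration of the statistics behind the rural-versus-urban comparisons: differences such as 58.8% vs. 40.2% of clinicians citing inconsistent Wi-Fi (P = 0.030) are the kind of result typically produced by a chi-square test of independence on a 2x2 table of respondent counts. The abstract does not state which test or the exact group sizes, so the counts below are hypothetical placeholders chosen only to reproduce similar percentages; this is a sketch of the calculation, not the authors' analysis.

    # Hypothetical 2x2 chi-square test, illustrating a rural vs. urban comparison
    # like the one described in the abstract; the counts are invented for this sketch.
    from scipy.stats import chi2_contingency

    rural_yes, rural_total = 30, 51     # ~58.8% citing inconsistent Wi-Fi (assumed group size)
    urban_yes, urban_total = 61, 152    # ~40.1% citing inconsistent Wi-Fi (assumed group size)

    table = [
        [rural_yes, rural_total - rural_yes],
        [urban_yes, urban_total - urban_yes],
    ]

    chi2, p_value, dof, expected = chi2_contingency(table)
    print(f"chi2 = {chi2:.2f}, p = {p_value:.3f}")  # p < 0.05 suggests a real urban/rural difference
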
39

Thethi, Ricky, Dharmik Vadel, Mark Haning e Elizabeth Tellier. "Digital innovation in subsea integrity management". APPEA Journal 60, n.º 1 (2020): 215. http://dx.doi.org/10.1071/aj19123.

Abstract:
Since the 2014 oil-price downturn, the offshore oil and gas industry has accelerated implementation of digital technologies to drive cost efficiencies for exploration and production operations. The upstream offshore sector comprises many interfacing disciplines such as subsurface, drilling and completions, facilities and production operations. Digital initiatives in subsurface imaging, drilling of subsea wells and topsides integrity have been well publicised within the industry. Integrity of the subsea infrastructure is one area that is currently playing catch up in the digital space and lends itself well for data computational efficiencies that artificial-intelligence technologies provide, to reduce cost and lower the risk of subsea equipment downtime. This paper details digital technologies employed in the area of subsea integrity management to meet the objectives of centralising access to critical integrity data, automating workflows to collect and assess data, and using machine learning to perform more accurate and faster engineering analysis with large volumes of field-measured data. A comparison of a typical subsea field is presented using non-digital and digital approaches to subsea integrity management (IM). The comparison demonstrates where technologies such as digital twins for dynamic structures, and auto anomaly detection by using image recognition algorithms can be deployed to provide a step change in the quality of subsea integrity data coming from field. It is demonstrated how the use of a smart IM approach, combined with strong domain knowledge in subsea engineering, can lead to cost efficiencies in operating subsea assets.
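
The abstract mentions automated anomaly detection on field measurements but does not describe the underlying algorithms. Purely as a hypothetical sketch of the general idea, flagging unusual readings from a subsea sensor stream can be as simple as a rolling z-score check; the window size, threshold and synthetic data below are all assumptions for illustration and are not the authors' implementation.

    # Hypothetical sketch: flag anomalous subsea sensor readings with a rolling z-score.
    import numpy as np
    import pandas as pd

    def flag_anomalies(series: pd.Series, window: int = 24, z_threshold: float = 3.0) -> pd.Series:
        """Boolean mask marking points far from the recent rolling mean."""
        rolling_mean = series.rolling(window, min_periods=window).mean()
        rolling_std = series.rolling(window, min_periods=window).std()
        z = (series - rolling_mean) / rolling_std
        return z.abs() > z_threshold

    # Synthetic hourly riser-tension readings with one injected excursion
    rng = np.random.default_rng(0)
    tension = pd.Series(1000 + rng.normal(0, 5, 500))
    tension.iloc[300] += 60  # simulated anomaly
    print(tension[flag_anomalies(tension)])  # prints the flagged reading(s)
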
40

Thiele, Marco R., e Roderick P. Batycky. "Using Streamline-Derived Injection Efficiencies for Improved Waterflood Management". SPE Reservoir Evaluation & Engineering 9, n.º 02 (1 de abril de 2006): 187–96. http://dx.doi.org/10.2118/84080-pa.

Abstract:
Summary This paper describes a novel approach to predict injection- and production-well rate targets for improved management of waterfloods. The methodology centers on the unique ability of streamlines to define dynamic well allocation factors (WAFs) between injection and production wells. Streamlines allow well allocation factors to be broken down additionally into phase rates at either end of each injector/producer pair. Armed with these unique data, it is possible to define the injection efficiency (IE) for each injector and for injector/producer pairs in a simulation model. The IE quantifies how much oil can be recovered at a producing well for every unit of water injected by an offset injector connected to it. Because WAFs are derived directly from streamlines, the data reflect all the complexities impacting the dynamic behavior of the reservoir model, including the spatial permeability and porosity distributions, fault locations, the underlying computational grid, relative permeability data, pressure/volume/temperature (PVT) properties, and most importantly, historical well rates. The possibility to define IEs through streamline simulation stands in contrast to the ad hoc definition of geometric WAFs and simple surveillance methods used by many practicing reservoir engineers today. Once IEs are known, improved waterflood management can be implemented by reallocating injection water from low-efficiency to high-efficiency injectors. Even in the case in which water cannot be reallocated because of local surface-facility constraints, knowing IEs on an injector/producer pair allows the setting of target rates to maintain oil production while reducing water production. We demonstrate this methodology by first introducing the concept of IEs, then use a small reservoir as an example application. Introduction Local areas of water cycling and poor sweep exist as a flood matures. Current flood management is restricted to surveillance methods or workflows centered on finite-difference (FD) simulation, where areas of bypassed oil are identified and then rate changes, producer/injector conversions, or infill-drilling scenarios are tested. However, identifying and testing improved management scenarios in this way can be laborious, particularly for waterfloods with a large number of wells and/or a relatively high-resolution numerical grid. For mature fields that have potential for improved production without introducing new wells or producer/injector conversions, the main goal is to manage well rates so as to reduce cycling of the injected fluid while maintaining or even increasing oil production. Reservoir engineers have no easy or automated way to identify injection patterns, well-pair connections, or areas of inefficiency beyond simple standard fixed-pattern surveillance techniques (Baker 1997; Baker 1998; Batycky et al. 2005). Such methods are approximate at best owing to the need to define geometric allocation factors and fixed patterns, which suffer from "out-of-pattern" flow. These limitations are removed through streamline-based surveillance models (Batycky et al. 2005). By adding a transport step along streamlines, streamline simulation (3DSL 2006) can additionally identify how much oil production results from an associated injector, quantifying the efficiency down to an individual injector/producer pair. 
It is this crucial piece of information—the efficiency of an injector/producer pair—that allows an improved estimation of future target rates, leading to improved reservoir flood management.
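
The abstract defines injection efficiency conceptually (oil recovered at offset producers per unit of water injected) but does not reproduce the computation. As a rough sketch under simple assumptions, the quantity can be illustrated as below; the allocation factors, rates and well names are invented for the example, and the real values would come from streamline tracing in the simulator.

    # Illustrative sketch (not the authors' code): injection efficiency (IE) of each injector
    # as offset oil production attributable to it per barrel of water injected.

    # oil_waf[p][i]: fraction of producer p's oil rate supported by injector i (hypothetical)
    oil_waf = {
        "PROD-A": {"INJ-1": 0.7, "INJ-2": 0.3},
        "PROD-B": {"INJ-1": 0.4, "INJ-2": 0.6},
    }
    oil_rate = {"PROD-A": 1200.0, "PROD-B": 500.0}             # bbl/day, hypothetical
    water_injection_rate = {"INJ-1": 5000.0, "INJ-2": 4000.0}  # bbl/day, hypothetical

    def injection_efficiency(injector: str) -> float:
        """Oil produced at offset producers per barrel of water injected by `injector`."""
        offset_oil = sum(oil_rate[p] * factors.get(injector, 0.0)
                         for p, factors in oil_waf.items())
        return offset_oil / water_injection_rate[injector]

    for inj in water_injection_rate:
        print(inj, f"IE = {injection_efficiency(inj):.3f}")
    # Injection water would then be shifted from low-IE to high-IE injectors,
    # subject to surface-facility constraints, as the paper proposes.
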
41

Ueda, Miriam, Juliana Monte Real, Paulo Guilherme A. G. Oliveira, Eloisa de Sa Moreira, Luiz Fernando Lima Reis, Celso FH Granato e Celso Arrais Rodrigues. "A New Strategy Based On qPCR to Optimize Detection and Quantification of Eight Herpesviruses in Patients Undergoing Allogeneic Hematopoietic Stem Cell Transplantation." Blood 120, n.º 21 (16 de novembro de 2012): 3044. http://dx.doi.org/10.1182/blood.v120.21.3044.3044.

Abstract:
Abstract Abstract 3044 Human herpesviruses may cause severe complications after Hematopoietic Stem Cell Transplantation (HSCT) as interstitial pneumonia, encephalitis and post-transplantation lymphoproliferative disease (PTLD). Monitoring these viruses and providing precise, rapid and early diagnosis of related clinical diseases constitute an essential measure to improve outcomes. A prospective survey on the incidence and clinical features of herpesvirus infections after HSCT has not yet been performed in Brazilian patients. Additionally, the impact of most of these infections on the HSCT outcome is still unclear. We sought to develop a test based on real-time polymerase chain reaction (qPCR) to screen and quantify all known human herpesviruses (CMV, EBV, HSV1, HSV2, VZV, HHV6, HHV7 and HHV8) in plasma samples from patients submitted to HSCT. DNA purification from plasma samples has been performed with the QIAamp DNA Blood Mini Kit (manually) or with the QIAamp DNA Blood Mini QIAcube Kit and the QIAcube robot (Qiagen). At least two sets of primers previously described have been tested for each virus for the approach using SYBR Green in order to select for the sets with best efficiency and sensitivity. The sets of primers and TaqMan® probes for the hydrolysis approach have also been previously described. Lambda phage and a commercial internal positive control (IPC, Life Technologies) have been tested as internal controls. The viruses probes were labeled with FAM, while the IPC probe was labeled with VIC. All qPCR reactions have been performed in a 7900HT (Life Technologies). Infected cell cultures and plasma specimens with a known viral load/amplicon copy number have been used as controls. By august 2012, 824 whole blood and plasma were collected from 91 patients. Initially, we tested a screening approach based on three sets of triplex qPCR reactions (including the internal control) using SYBR Green and melting analysis. Although the test showed good linearity across 6 to 7 orders of magnitude in the log scale for most of the targets, the discrimination was poor for low-copy samples (≤ 103 copies of the target/ reaction) or complex samples (positive for more than one virus). We then chose to optimize a strategy based on the use of hydrolysis probes, the gold standard in molecular pathology. Except for EBV, which has been amplified and detected in a duplex reaction along with the IPC, the other amplicons have been screened and quantified in singleplex reactions. All targets presented efficiencies between 90–100% and linearity ranging from at least 25 to 108 copies per reaction. For most of the viruses the lower limit of detection (LOD) is around 5 copies of target per reaction, representing 250 copies/mL of plasma; HHV6 and VZV detection, with sensitivity around 25 copies per reaction, is under further optimization. No cross-reaction or false positive results were detected and within-run and between-run precision estimates are equal or higher than 95%. A semi-automated workflow, using the QIAcube robot for DNA extraction and the QIAgility for reaction setup is under validation. Accuracy will be assigned by testing commercial controls (Acrometrix® plasma panels and controls). Based on the precision of the test, we predict that it will be possible to use this new test to screen batches of 10 samples in 96-well plates or 46 samples in 384-well plates in singlicates for the eight known herpesviruses with high sensitivity and specificity. 
Only selected samples will then be submitted to fine quantification in a second round of qPCR reactions including the appropriate standard curve(s). This strategy allows for fast and comprehensive detection of the known herpesviruses in post-HSCT patients, while integrating the main advantages of hydrolysis probes, namely high sensitivity and specificity. Disclosures: No relevant conflicts of interest to declare.
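
Two pieces of routine qPCR arithmetic sit behind the efficiency and sensitivity figures quoted above: amplification efficiency is conventionally derived from the slope of a log10 standard curve, and an LOD in copies per reaction converts to copies per millilitre of plasma through the plasma-equivalent volume loaded per reaction. The sketch below only illustrates that arithmetic; the 20 uL plasma-equivalent input is an assumption inferred from the reported 5 copies/reaction ≈ 250 copies/mL, not a value stated by the authors.

    # Standard qPCR arithmetic, shown as a hedged illustration of the abstract's figures.
    def amplification_efficiency(standard_curve_slope: float) -> float:
        """Efficiency from a log10 standard curve: E = 10**(-1/slope) - 1 (1.0 == 100%)."""
        return 10 ** (-1.0 / standard_curve_slope) - 1.0

    # A slope near -3.32 corresponds to ~100% efficiency; around -3.6 to ~90%.
    for slope in (-3.32, -3.6):
        print(f"slope {slope}: efficiency = {amplification_efficiency(slope):.0%}")

    def copies_per_ml(copies_per_reaction: float, plasma_equivalent_ul: float = 20.0) -> float:
        """Convert an LOD in copies/reaction to copies/mL of plasma.
        The 20 uL plasma equivalent per reaction is an assumption made for this sketch
        so that 5 copies/reaction maps to the reported 250 copies/mL."""
        return copies_per_reaction / (plasma_equivalent_ul / 1000.0)

    print(copies_per_ml(5))  # -> 250.0 copies/mL
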
42

Sullivan, Clair, e Andrew Staib. "Digital disruption ‘syndromes’ in a hospital: important considerations for the quality and safety of patient care during rapid digital transformation". Australian Health Review 42, n.º 3 (2018): 294. http://dx.doi.org/10.1071/ah16294.

Abstract:
The digital transformation of hospitals in Australia is occurring rapidly in order to facilitate innovation and improve efficiency. Rapid transformation can cause temporary disruption of hospital workflows and staff as processes are adapted to the new digital workflows. The aim of this paper is to outline various types of digital disruption and some strategies for effective management. A large tertiary university hospital recently underwent a rapid, successful roll-out of an integrated electronic medical record (EMR). We observed this transformation and propose several digital disruption “syndromes” to assist with understanding and management during digital transformation: digital deceleration, digital transparency, digital hypervigilance, data discordance, digital churn and post-digital ‘depression’. These ‘syndromes’ are defined and discussed in detail. Successful management of this temporary digital disruption is important to ensure a successful transition to a digital platform. What is known about this topic? Digital disruption is defined as the changes facilitated by digital technologies that occur at a pace and magnitude that disrupt established ways of value creation, social interactions, doing business and more generally our thinking. Increasing numbers of Australian hospitals are implementing digital solutions to replace traditional paper-based systems for patient care in order to create opportunities for improved care and efficiencies. Such large scale change has the potential to create transient disruption to workflows and staff. Managing this temporary disruption effectively is an important factor in the successful implementation of an EMR. What does this paper add? A large tertiary university hospital recently underwent a successful rapid roll-out of an integrated electronic medical record (EMR) to become Australia’s largest digital hospital over a 3-week period. We observed and assisted with the management of several cultural, behavioural and operational forms of digital disruption which lead us to propose some digital disruption ‘syndromes’. The definition and management of these ‘syndromes’ are discussed in detail. What are the implications for practitioners? Minimising the temporary effects of digital disruption in hospitals requires an understanding that these digital ‘syndromes’ are to be expected and actively managed during large-scale transformation.
43

Ribeiro, Sara, Dorte Wren, Lisa Thompson, Michael Hubank e David Taussig. "Comparison of Three Assays for Identification of IDH Mutations in AML". Blood 134, Supplement_1 (13 de novembro de 2019): 5195. http://dx.doi.org/10.1182/blood-2019-126494.

Abstract:
Introduction Isocitrate dehydrogenase (IDH) mutations are present in up to 20% of acute myeloid leukemia (AML) patients and lead to production of 2-hydroxyglutarate which promotes impaired differentiation and leukemic cell proliferation. Currently, there are two FDA-approved IDH inhibitors for the treatment of AML: enasidenib and ivosidenib. It is now imperative to establish timely testing so that patients can take advantage of these new therapies. Here we compared three different genotyping strategies to identify the most reliable and cost-effective way of picking up IDH1 and IDH2 mutations in AML. Methodology We have tested 60 AML patient samples across three different methods for detection of IDH1/2 mutations: (i) Capillary Electrophoresis-Single Strand Conformational Analysis (CE-SSCA) with Sanger sequencing, (ii) Next generation sequencing (NGS) myeloid capture panel (SureSeq myPanel, Oxford Gene Technology, OGT; UK) and (iii) multi-cancer NGS amplicon panel (QIAseq Targeted DNA Panel, Qiagen, Germany). The NGS capture assay is a 26 gene panel designed for myeloid-related genes while the NGS amplicon panel includes 33 genes associated with haematological and non-haematological malignancies (Figure 1). We have compared the techniques and assessed the quality of the results by measuring gene coverage, read depth across the panels, sensitivity and specificity per test, and performed a final valuation including some practical considerations: cost analysis, staff timings, ease of assay operation, turnaround times (TATs), ease of analysis and reporting as well as batch size limits. We initially tested 424 AML cases with CE-SSCA and have successfully transferred to amplicon panel testing (to date 200 cases). Results The results from the initial 60 cases were concordant between both NGS technologies and Sanger sequencing. In total across all 3 assays, we have detected IDH mutations in 19% of the 624 samples tested (Figure 2). Two samples showed false positive results by CE-SSCA, but these were identified by Sanger sequencing; therefore CE-SSCA alone is an inadequate test and all positive samples by CE-SSCA have to be confirmed by Sanger sequencing. Two samples showed IDH1 R132H mutation with variant allele frequency less than 10%, which was detected by all methods, however the Sanger sequencing trace was very small and could potentially have been missed. The gene minimum read depth for capture NGS was 682 and 677 for amplicon, and the gene mean depth was 1571 and 934 reads, respectively. We did not identify any new genetic variants in IDH1/2. The cost of reagents was higher for the NGS capture panel at $235 per patient compared to amplicon based at $86 per patient, with CE-SSCA/Sanger sequencing combined costing $50. The amplicon NGS allows processing of up to 96 samples per batch, allowing for high throughput testing, while capture NGS only allows for 16 samples to be processed. The library preparation time (Figure 1) is shorter for NGS amplicon (2 days; 12 hours staff time) compared to capture (3 days; 17 hours staff time), with both techniques taking longer to set-up when compared to CE-SSCA (2 hours) and Sanger sequencing (1 day; 6 hours staff time). Looking at assay operation, CE-SSCA and Sanger are the less complex assays allowing for laboratory accessibility, whereas although NGS amplicon was relatively straight forward, NGS capture was slightly more complex, lending itself to specialist laboratory setup. 
When looking at analysis, CE-SSCA was straightforward; however, certain patterns that are very similar to positive controls can lead to false positives. Both NGS analyses were very straightforward, taking <5 min per sample, while CE-SSCA/Sanger can take up to 10 min per sample in more difficult cases. Importantly, NGS-based approaches incorporate other clinically relevant genes, whereas CE-SSCA is limited to IDH1/2 evaluation, with additional testing being required for other targets. Conclusion: We have established a multi-cancer NGS amplicon assay for the detection of IDH mutations in AML patients. It reduces test costs for patients, improves testing efficiency, and allows additional clinically relevant genes to be analysed in parallel. It has also helped streamline testing for different cancer types, which can now all follow the same workflow and be automated, thus improving TATs and contributing to better patient management. Acknowledgements: Celgene provided funding for this study. Disclosures Taussig: Celgene: Research Funding.
44

Slavcev, Mary, Allison Spinelli, Elisabeth Absalon, Tara Masterson, Chris Heuck, Annette Lam e Erwin De Cock. "Interim Results of a Time and Motion Survey Regarding Subcutaneous Versus Intravenous Administration of Daratumumab in Patients with Relapsed or Refractory Multiple Myeloma". Blood 136, Supplement 1 (5 de novembro de 2020): 30–31. http://dx.doi.org/10.1182/blood-2020-139995.

Abstract:
Background: In addition to efficacy and safety, patient and healthcare professional (HCP) experiences are important aspects of treatment selection. Daratumumab (DARA) is a humanized monoclonal antibody targeting CD38 and is approved as monotherapy or in combination with standard of care regimens for the treatment of multiple myeloma (MM). Administration of DARA intravenous (IV) takes approximately 7 hours for the first infusion and 3-4 hours for subsequent infusions. To reduce this burden, DARA subcutaneous (SC; DARA 1800 mg coformulated with recombinant human hyaluronidase PH20 [rHuPH20; 2000 U/ml; ENHANZE® drug delivery technology, Halozyme, Inc., San Diego, CA, USA]) was developed. In the phase 3 COLUMBA trial (NCT03277105), at a median follow-up of 7.5 months, overall response and maximum trough concentration with DARA SC were noninferior to DARA IV in patients with relapsed/refractory MM (RRMM). Based on these results, DARA SC was approved by the Food and Drug Administration and European Medicines Agency. Based on an SC injection duration of 5 minutes, administration of DARA SC injections is estimated to take <2 hours (115 minutes) of HCP time during the first year of treatment. A time and motion survey was undertaken to elicit HCPs' understanding of workflow and time estimates for administration of DARA IV and SC (beyond injection time alone) in patients with RRMM. Data collection was halted due to the COVID-19 pandemic. Here, we report the interim survey results. Methods: A web-based, prospective survey was designed to collect primary data from HCPs at sites that actively enrolled patients in the COLUMBA trial. Data were collected for DARA IV and SC regarding time spent on prespecified drug preparation and drug administration/patient care activities; for each task, the respondent's perception of average time and HCP who performs the task were captured. Patient data, including efficacy and safety information, were not collected. The primary endpoints are mean and median active HCP time for each prespecified activity. Time for each activity was adjusted by its probability of occurring (eg, management of infusion-related reactions). Median results are reported here, as these are considered a better measure of central tendency than the mean. Total median active HCP time was calculated by summing median time for all prespecified activities (drug preparation activities vs activities in the patient care area/infusion suite). A post-hoc analysis estimated patient chair time based on HCP inputs for pre-treatment activities, infusion/injection duration, and post-treatment activities. A sensitivity analysis was conducted comparing a subgroup of respondents with fully validated data with the overall study population. Results: A total of 26 respondents from 8 countries (Brazil, Israel, Japan, Greece, Poland, Sweden, Taiwan, and Ukraine) completed the survey. For DARA IV, the median total active HCP time was 294.2 minutes for the first infusion and 194.9 minutes for subsequent infusions (Figure 1). For DARA SC, the median total active HCP time was 98.7 minutes for the first injection (66.5% reduction in time vs DARA IV) and 82.2 minutes for subsequent injections (57.8% reduction in time vs DARA IV) (Figure 1). For both treatments, the proportions of time spent on drug preparation vs drug administration/patient care were roughly similar for first and subsequent administrations (Table).
When extrapolated for year 1 and year 2 (23 administrations in year 1 and 13 in year 2, as per label), estimated active HCP time per patient was 76.4 and 42.2 hours, respectively, for DARA IV and 31.8 and 17.8 hours, respectively, for DARA SC. Estimated chair time for DARA IV was 445.6 minutes for the first infusion and 243.1 minutes for subsequent infusions; for DARA SC, estimated chair time for first and subsequent injections was 8.6 and 6.9 minutes, respectively (Figure 2). Results were confirmed by a sensitivity analysis using fully validated data for 13 of the 26 respondents. Conclusions: Results of this time and motion survey indicate that DARA SC is associated with less active HCP time spent on drug preparation and drug administration/patient care compared with DARA IV. This reduced treatment burden may translate into advantages for patients (less time away from home, family, and/or work) and efficiencies for HCPs and healthcare facilities (ability to treat a greater number of patients). Disclosures Slavcev: Janssen: Current Employment. Spinelli:Janssen: Current Employment. Absalon:Syneos Health: Current Employment. Masterson:Janssen: Current Employment. Heuck:Janssen: Current Employment. Lam:Janssen: Current Employment. De Cock:Syneos Health: Current Employment.
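
The year-1 figures quoted above follow directly from the per-administration medians and the 23 administrations in the first year (one first dose plus 22 subsequent doses). The short sketch below simply reproduces that extrapolation from the abstract's own numbers; it adds no new study data.

    # Reconstructing the year-1 active-HCP-time extrapolation from the reported medians.
    def year_one_hours(first_admin_min: float, subsequent_admin_min: float,
                       administrations: int = 23) -> float:
        """Total active HCP hours: one first administration plus the rest as subsequent doses."""
        total_minutes = first_admin_min + (administrations - 1) * subsequent_admin_min
        return total_minutes / 60.0

    print(f"DARA IV, year 1: {year_one_hours(294.2, 194.9):.1f} h")  # ~76.4 h, as reported
    print(f"DARA SC, year 1: {year_one_hours(98.7, 82.2):.1f} h")    # ~31.8 h, as reported
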
45

Aithal, Adithya, Manish Kumar Singh, Papri Ray e R. Duraipandian. "An Integrated Approach to understand Supply Chain Optimization through the Lens of Technology". Shanlax International Journal of Management 8, S1-Feb (26 de fevereiro de 2021): 167–78. http://dx.doi.org/10.34293/management.v8is1-feb.3772.

Abstract:
Supply chain management and optimisation is a crucial aspect of modern organisations and a successful research area. The presence of uncertainty within supply chains is addressed as an important problem for efficient capacity utilization and difficult infrastructure choices. Organizations are continually refining processes by implementing business process improvement solutions to deliver better business outcomes, even in times of disruption. A lack of transaction and inventory visibility may be one of the important problems at the moment. With the assistance of intelligent workflows, technology can help develop supply chain processes to eliminate operational silos, respond to market disruptions, minimize risk and sustain business continuity. The paper discusses how the use of embedded AI capabilities in the supply chain can provide real-time intelligence and actionable recommendations. Data-driven insights help increase efficiencies and reduce costs. Technologies such as Blockchain, IoT, Analytics, Software Process Improvement and many more can impact the supply chain immensely. The paper describes various technologies, including blockchain, and how they improve the supply chain by improving traceability, auditability, accountability, actionability and visibility. The right technology helps the business make the supply chain more accessible, gain more leverage over inventory, reduce operational costs, and eventually outpace the competition.
46

McGibbon, Scott, Mohamed Abdel-Wahab e Ming Sun. "Towards a digitised process-wheel for historic building repair and maintenance projects in Scotland". Journal of Cultural Heritage Management and Sustainable Development 8, n.º 4 (19 de novembro de 2018): 465–80. http://dx.doi.org/10.1108/jchmsd-08-2017-0053.

Abstract:
Purpose With the increasing demand for high-quality, economical and sustainable historic building repair and maintenance (R&M), allied with the perennial problem of skills shortages (in project management (PM) and on-site practice), investment in new technologies becomes paramount for modernising training and practice. Yet the historic R&M industry, in particular small- and medium-sized enterprises, has yet to benefit from digital technologies (such as laser scanning, virtual reality and cloud computing) which have the potential to enhance performance and productivity. The paper aims to discuss these issues. Design/methodology/approach A qualitative participatory action research approach was adopted. One demonstration project (Project A) exhibiting critical disrepair is reported, showcasing the piloting of a five-phase digitised "process-wheel" intended to provide a common framework for facilitating collaboration among project stakeholders, thereby aiding successful project delivery. Five semi-structured interviews were conducted with industry employers to facilitate the development of the process-wheel concept. Findings Implementing only Phase 1 of the digitised "process-wheel" (e-Condition surveying incorporating laser scanning) resulted in estimated cost and time savings of 25-30 per cent compared to conventional methods. The accrued benefits are twofold: it provides a structured, standardised data-capturing approach that is shared in a common project repository amongst relevant stakeholders, and it informs the application of digital technologies to attain efficiencies across the various phases of the process-wheel. Originality/value This paper has provided original and valuable information on the benefits of modernising R&M practice, highlighting the importance of continued investment in innovative processes and new technologies for historic building R&M to enhance existing practice and inform current training provision. Future work will focus on further piloting and validation of the process-wheel in its entirety on selected demonstration projects, with a view to supporting the industry in digitising its workflows and going fully digital to realise optimum process efficiencies.
47

Holt, Cydne L., Kathryn M. Stephens, Paulina Walichiewicz, Keenan D. Fleming, Elmira Forouzmand e Shan-Fu Wu. "Human Mitochondrial Control Region and mtGenome: Design and Forensic Validation of NGS Multiplexes, Sequencing and Analytical Software". Genes 12, n.º 4 (19 de abril de 2021): 599. http://dx.doi.org/10.3390/genes12040599.

Abstract:
Forensic mitochondrial DNA (mtDNA) analysis conducted using next-generation sequencing (NGS), also known as massively parallel sequencing (MPS), brings modern advantages as compared to Sanger-type sequencing, such as deep coverage per base (herein referred to as read depth per base pair (bp)), simultaneous sequencing of multiple samples (libraries) and increased operational efficiencies. This report describes the design and developmental validation, according to forensic quality assurance standards, of end-to-end workflows for two multiplexes, comprising the ForenSeq mtDNA control region and mtDNA whole-genome kits, the MiSeq FGx™ instrument and ForenSeq universal analysis software (UAS) 2.0/2.1. Polymerase chain reaction (PCR) enrichment and a tiled amplicon approach target small, overlapping amplicons (60–150 bp and 60–209 bp for the control region and mtGenome, respectively). The system provides convenient access to data files that can be used outside of the UAS if desired. Studies assessed a range of sample types and environmental and situational variables, including but not limited to buccal samples, rootless hairs, dental and skeletal remains, concordance of control region typing between the two multiplexes and as compared to orthogonal data, assorted sensitivity studies, two-person DNA mixtures and PCR-based performance testing. Limitations of the system and implementation considerations are discussed. Data indicated that the two mtDNA multiplexes, the MiSeq FGx instrument and the ForenSeq software meet or exceed forensic DNA quality assurance (QA) guidelines, with robust, reproducible performance on samples of various quantities and qualities.
48

Mhangara e Mapurisa. "Multi-Mission Earth Observation Data Processing System". Sensors 19, n.º 18 (4 de setembro de 2019): 3831. http://dx.doi.org/10.3390/s19183831.

Abstract:
The surge in the number of earth observation satellites being launched worldwide is placing significant pressure on the satellite-direct ground receiving stations that are responsible for systematic data acquisition, processing, archiving, and dissemination of earth observation data. Growth in the number of satellite sensors has a bearing on the ground segment payload data processing systems due to the complexity, volume, and variety of the data emanating from the different sensors. In this paper, we have aimed to present a generic, multi-mission, modularized payload data processing system that we are implementing to optimize satellite data processing from historical and current sensors, directly received at the South African National Space Agency’s (SANSA) ground receiving station. We have presented the architectural framework for the multi-mission processing system, which is comprised of five processing modules, i.e., the data ingestion module, a radiometric and geometric processing module, atmospheric correction and Analysis Ready Data (ARD) module, Value Added Products (VAPS) module, and lastly, a packaging and delivery module. Our results indicate that the open architecture, multi-mission processing system, when implemented, eliminated the bottlenecks linked with proprietary mono-mission systems. The customizable architecture enabled us to optimize our processing in line with our hardware capacities, and that resulted in significant gains in large-scale image processing efficiencies. The modularized, multi-mission data processing enabled seamless end-to-end image processing, as demonstrated by the capability of the multi-mission system to execute geometric and radiometric corrections to the extent of making it analysis-ready. The processing workflows were highly scalable and enabled us to generate higher-level thematic information products from the ingestion of raw data.
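
The five modules described above form a linear, modular processing chain in which individual stages can be swapped or tuned per sensor. The outline below is only a schematic sketch of how such a chain might be wired together; the class, function and field names are invented for illustration and do not come from SANSA's system.

    # Schematic sketch of a modular, multi-mission processing chain (all names invented).
    from typing import Callable, List

    class Product:
        """Minimal stand-in for a satellite data product moving through the chain."""
        def __init__(self, payload: dict):
            self.payload = payload
            self.history: List[str] = []

    def ingest(p: Product) -> Product:
        p.history.append("ingested raw payload data"); return p

    def radiometric_geometric(p: Product) -> Product:
        p.history.append("applied radiometric and geometric corrections"); return p

    def atmospheric_ard(p: Product) -> Product:
        p.history.append("atmospheric correction -> Analysis Ready Data (ARD)"); return p

    def value_added(p: Product) -> Product:
        p.history.append("generated value-added products (VAPS)"); return p

    def package_deliver(p: Product) -> Product:
        p.history.append("packaged and delivered"); return p

    PIPELINE: List[Callable[[Product], Product]] = [
        ingest, radiometric_geometric, atmospheric_ard, value_added, package_deliver,
    ]

    def run(product: Product) -> Product:
        for step in PIPELINE:  # modules can be replaced or reordered per mission
            product = step(product)
        return product

    result = run(Product({"sensor": "hypothetical-sensor", "scene_id": "0001"}))
    print("\n".join(result.history))
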
49

Leen, Robert. "Shape optimisation of a snowboard binding highback. A case study of generative design process comparison". KnE Engineering 2, n.º 2 (9 de fevereiro de 2017): 73. http://dx.doi.org/10.18502/keg.v2i2.598.

Abstract:
FEA software is traditionally expensive to purchase, requires a high level of technical skill and understanding, and requires users to dedicate years to developing specialist skills. With the increasing popularity of more user-friendly, elementary software packages such as Fusion 360, more cost-effective and efficient processes can be developed and harnessed, especially by SMEs and designers that do not have the ability to purchase expensive software packages. One particular FEA element that has recently begun transitioning from highly specialised to more readily available is 'generative design' and 'shape optimisation'. Until recently, shape optimisation could only be utilised by large corporations with large research and development budgets. This case study looks at exploring and optimising the methods involved in generative design for product development and is aimed at facilitating practices for small to medium enterprises (SMEs).
The work described in this paper presents a study using a snowboard binding highback component which was reverse engineered using 3D scanning. A blank model, free of any discerning features, was created from the scan and then used as the platform for the generative design phase. This process was completed using easily accessible software (Fusion 360) as well as high-end professional software (Ansys 16). A comparison between the two workflows analyses the resultant model outcomes and outlines efficiencies regarding processing time, technical skill, and latent difficulties of the entry-level process for generative design of the snowboard highback.
This paper aims to demonstrate and describe an optimisation model for generative design and shape optimisation during entry-level product development.
50

Rodgers, Georgina T. "Development of an infusion nurse staffing model for outpatient chemotherapy centers." Journal of Clinical Oncology 34, n.º 7_suppl (1 de março de 2016): 103. http://dx.doi.org/10.1200/jco.2016.34.7_suppl.103.

Abstract:
103 Background: Nursing roles and responsibilities within ambulatory oncology infusion suites across our health system are not clearly defined, and it is not understood what the appropriate staffing ratio should be per site. It is not clear if employees are working to the highest level of their licensure or skill, and if the appropriate activities are performed by the correct department. A standard staffing model to provide efficiency of clinical services and patient safety does not exist, and nursing roles are variable between the sites. Similar patient populations are being treated at each site, and the variability of the roles introduces unnecessary costs to the system as a whole. The purpose of this project was to define the roles of the infusion nurse to ensure performance to the highest level of licensure, create efficiencies within the clinical setting, potentially reduce RN staffing requirements, achieve cost savings, and develop a target nurse-to-patient ratio while maintaining quality care. Methods: Daily patient volume and hours of operation were compiled for each outpatient site, and three methodologies were used to determine the nurse-to-patient ratio. We utilized an acuity-based ratio tool, an hours per unit (HPU) method using billed charges for technical procedures, and finally a simple 1:6 ratio based upon patient volume. Each methodology showed similar results, and a final target ratio of 1:6 was chosen. Results: A staffing template was created to predict the number of RNs necessary for treatment, and an analysis of infusion sites was also completed to observe workflows and determine potential staffing adjustments. Our pilot site was initially staffed with 14 RN FTEs, and analysis showed many non-clinical, non-nursing duties were being performed by RNs. Through process improvement we have created clear role delineation, and the site currently functions with 5 RN FTEs. We have maximized the efficiency of the nursing team and reduced costs, and there has been no decline or compromise in quality or patient safety. Conclusions: The implications of establishing this standard for infusion nursing have allowed us to duplicate the methodology across the health system and achieve a level of staffing that matches well with patient care needs.
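
Of the three methodologies mentioned, the simple volume-based approach is the easiest to illustrate: divide the expected concurrent patient load by the 1:6 nurse-to-patient target and scale for shift coverage. The sketch below is a hypothetical illustration of that calculation only; the average chair time, hours of operation and shift length are assumptions, and the acuity-based and hours-per-unit methods require data not given in the abstract.

    # Hypothetical illustration of a simple 1:6 volume-based staffing estimate.
    import math

    def rn_needed(daily_patients: int, patients_per_rn: int = 6,
                  hours_open: float = 10.0, shift_hours: float = 8.0,
                  avg_chair_time_hours: float = 2.0) -> int:
        """RNs needed to keep concurrent assignments at or below the target ratio,
        assuming visits are spread evenly across the day (a simplification)."""
        concurrent_patients = daily_patients * avg_chair_time_hours / hours_open
        rns_on_floor = math.ceil(concurrent_patients / patients_per_rn)
        # scale up so the floor is covered across the full day of operation
        return math.ceil(rns_on_floor * hours_open / shift_hours)

    print(rn_needed(daily_patients=60))  # e.g. 60 patients/day -> 3 RN FTEs under these assumptions
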
