Academic literature on the topic 'Data Workflow Efficiency'

Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles


Consult the lists of relevant articles, books, theses, conference reports, and other scholarly sources on the topic 'Data Workflow Efficiency.'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate a bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever it is available in the metadata.

Journal articles on the topic "Data Workflow Efficiency"

1

Graham, Andrew, Stephan Gmur, and Travis Scott. "Improved SCAT data workflow to increase efficiency and data accuracy." International Oil Spill Conference Proceedings 2017, no. 1 (2017): 2674–93. http://dx.doi.org/10.7901/2169-3358-2017.1.2674.

Full text
Abstract:
ABSTRACT #2017-302 Traditional Shoreline Cleanup Assessment Technique (SCAT) data workflows typically entail collecting data in the field using notebooks, handheld GPS units and digital cameras, transcribing these data onto paper forms, and then manually entering into a local database. Processed data are pushed to a SCAT geographical information system (GIS) specialist, ultimately providing exports as paper and electronic versions of maps, spreadsheets and reports. The multiple and sometimes iterative steps required can affect the dissemination of accurate and timely information to decision ma
2

Weigel, Tobias, Ulrich Schwardmann, Jens Klump, Sofiane Bendoukha, and Robert Quick. "Making Data and Workflows Findable for Machines." Data Intelligence 2, no. 1-2 (2020): 40–46. http://dx.doi.org/10.1162/dint_a_00026.

Full text
Abstract:
Research data currently face a huge increase of data objects with an increasing variety of types (data types, formats) and variety of workflows by which objects need to be managed across their lifecycle by data infrastructures. Researchers desire to shorten the workflows from data generation to analysis and publication, and the full workflow needs to become transparent to multiple stakeholders, including research administrators and funders. This poses challenges for research infrastructures and user-oriented data services in terms of not only making data and workflows findable, accessible, int
3

Mahaboobsubani, Shaik. "Enhancing Workflow Efficiency in Finance with O365 Graph API." Journal of Scientific and Engineering Research 9, no. 2 (2022): 178–85. https://doi.org/10.5281/zenodo.14356693.

Full text
Abstract:
The use of the O365 Graph API to improve workflow efficiency in operations related to finance. The article is based on the capability of the Graph API to integrate data, automate business processes, and have real-time access to Microsoft services for seamless financial processes that reduce manual intervention and increase productivity. Among the main data integration methods discussed in the paper are automated data retrievals from SharePoint, Teams, and One Drive. It also quantifies the efficiency gains by reducing processing times, error rates, and operation costs compared to traditional ma
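The entry above centres on automating data retrieval from Microsoft 365 services through the Graph API. As a rough illustration of that idea (not the author's implementation), the sketch below lists files from a SharePoint site's default document library over the Graph REST endpoint; the site ID and access token are placeholders, and token acquisition (e.g., via MSAL) and paging are omitted.

```python
# Illustrative sketch (not the paper's implementation) of automated document
# retrieval from SharePoint via the Microsoft Graph REST API. The site ID and
# access token are placeholders; real code would obtain the token through an
# OAuth flow (e.g., MSAL) and handle paging and errors.
import requests

GRAPH = "https://graph.microsoft.com/v1.0"
ACCESS_TOKEN = "<token acquired via MSAL or another OAuth flow>"  # placeholder
SITE_ID = "<sharepoint-site-id>"                                  # placeholder

def list_sharepoint_files(site_id: str) -> list[dict]:
    """Return metadata for items in the root of a site's default document library."""
    url = f"{GRAPH}/sites/{site_id}/drive/root/children"
    resp = requests.get(
        url, headers={"Authorization": f"Bearer {ACCESS_TOKEN}"}, timeout=30
    )
    resp.raise_for_status()
    return resp.json().get("value", [])

if __name__ == "__main__":
    for item in list_sharepoint_files(SITE_ID):
        print(item.get("name"), item.get("lastModifiedDateTime"))
```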
4

Hribar, Michelle R., Sarah Read-Brown, Isaac H. Goldstein, et al. "Secondary use of electronic health record data for clinical workflow analysis." Journal of the American Medical Informatics Association 25, no. 1 (2017): 40–46. http://dx.doi.org/10.1093/jamia/ocx098.

Full text
Abstract:
Abstract Objective Outpatient clinics lack guidance for tackling modern efficiency and productivity demands. Workflow studies require large amounts of timing data that are prohibitively expensive to collect through observation or tracking devices. Electronic health records (EHRs) contain a vast amount of timing data – timestamps collected during regular use – that can be mapped to workflow steps. This study validates using EHR timestamp data to predict outpatient ophthalmology clinic workflow timings at Oregon Health and Science University and demonstrates their usefulness in 3 different studi
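The study above derives clinic workflow timings from EHR timestamps mapped to workflow steps. A minimal sketch of that general approach, with hypothetical column names and toy records rather than the study's actual data model, is:

```python
# Minimal sketch of deriving workflow-step timings from EHR-style timestamps,
# in the spirit of the study above. Column names and records are hypothetical;
# real EHR exports would require de-identification and validation.
import pandas as pd

events = pd.DataFrame({
    "visit_id": [1, 1, 1, 2, 2, 2],
    "step": ["check_in", "exam_start", "exam_end"] * 2,
    "timestamp": pd.to_datetime([
        "2017-03-01 08:00", "2017-03-01 08:20", "2017-03-01 08:45",
        "2017-03-01 09:00", "2017-03-01 09:10", "2017-03-01 09:40",
    ]),
})

# Minutes elapsed since the previous step within the same visit.
events = events.sort_values(["visit_id", "timestamp"])
events["minutes_since_prev_step"] = (
    events.groupby("visit_id")["timestamp"].diff().dt.total_seconds() / 60
)
# Average time it takes to reach each step, across visits.
print(events.groupby("step")["minutes_since_prev_step"].mean())
```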
5

Liew, Chee Sun, Malcolm P. Atkinson, Radosław Ostrowski, Murray Cole, Jano I. van Hemert, and Liangxiu Han. "Performance database: capturing data for optimizing distributed streaming workflows." Philosophical Transactions of the Royal Society A: Mathematical, Physical and Engineering Sciences 369, no. 1949 (2011): 3268–84. http://dx.doi.org/10.1098/rsta.2011.0134.

Full text
Abstract:
The performance database (PDB) stores performance-related data gathered during workflow enactment. We argue that, by carefully understanding and manipulating these data, we can improve efficiency when enacting workflows. This paper describes the rationale behind the PDB, and proposes a systematic way to implement it. The prototype is built as part of the Advanced Data Mining and Integration Research for Europe project. We use workflows from real-world experiments to demonstrate the usage of PDB.
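To make the idea of a performance database concrete, the sketch below records per-activity timings during workflow enactment in SQLite and queries average durations. The schema is an assumption for illustration, not the ADMIRE project's actual PDB design.

```python
# Minimal sketch of a "performance database" that records activity timings
# during workflow enactment and supports later analysis. The schema is an
# assumption for illustration, not the ADMIRE project's actual PDB design.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute(
    """CREATE TABLE activity_run (
           workflow_id TEXT,
           activity    TEXT,
           started_at  REAL,   -- seconds since workflow start
           finished_at REAL,
           bytes_in    INTEGER
       )"""
)
conn.executemany(
    "INSERT INTO activity_run VALUES (?, ?, ?, ?, ?)",
    [
        ("wf-1", "ingest", 0.0, 12.5, 10_000),
        ("wf-1", "transform", 12.5, 40.0, 10_000),
        ("wf-2", "ingest", 0.0, 11.0, 9_000),
    ],
)

# Average duration per activity: the kind of query an enactment engine could
# consult to choose placements or degrees of parallelism for future runs.
for activity, avg_duration in conn.execute(
    "SELECT activity, AVG(finished_at - started_at) FROM activity_run GROUP BY activity"
):
    print(activity, round(avg_duration, 2))
```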
6

Ma, Yinglong, Liying Zhang, and Moyi Shi. "Measuring Semantic and Structural Information for Data Oriented Workflow Retrieval with Cost Constraints." Mathematical Problems in Engineering 2014 (2014): 1–9. http://dx.doi.org/10.1155/2014/407809.

Full text
Abstract:
The reuse of data oriented workflows (DOWs) can reduce the cost of workflow system development and control the risk of project failure and therefore is crucial for accelerating the automation of business processes. Reusing workflows can be achieved by measuring the similarity among candidate workflows and selecting the workflow satisfying requirements of users from them. However, due to DOWs being often developed based on an open, distributed, and heterogeneous environment, different users often can impose diverse cost constraints on data oriented workflows. This makes the reuse of DOWs challe
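The truncated abstract does not spell out the paper's similarity measures, so the following is only a toy illustration of the general idea: score candidate data-oriented workflows by combining a semantic component (shared activity labels) with a structural component (shared edges), and keep only candidates within a user-imposed cost bound.

```python
# Toy illustration only: score candidate data-oriented workflows against a
# query by combining a semantic component (shared activity labels) with a
# structural component (shared edges), and keep candidates within a cost
# bound. Weights and scoring are assumptions, not the paper's measures.
def jaccard(a: set, b: set) -> float:
    return len(a & b) / len(a | b) if a | b else 0.0

def workflow_score(query, candidate, w_semantic=0.6, w_structural=0.4):
    semantic = jaccard(set(query["activities"]), set(candidate["activities"]))
    structural = jaccard(set(query["edges"]), set(candidate["edges"]))
    return w_semantic * semantic + w_structural * structural

query = {
    "activities": {"load", "clean", "aggregate"},
    "edges": {("load", "clean"), ("clean", "aggregate")},
}
repository = [
    {"name": "wf_a", "cost": 80,
     "activities": {"load", "clean", "report"},
     "edges": {("load", "clean"), ("clean", "report")}},
    {"name": "wf_b", "cost": 300,
     "activities": {"load", "clean", "aggregate"},
     "edges": {("load", "clean"), ("clean", "aggregate")}},
]

budget = 100  # user-imposed cost constraint
feasible = [wf for wf in repository if wf["cost"] <= budget]
best = max(feasible, key=lambda wf: workflow_score(query, wf))
print(best["name"], round(workflow_score(query, best), 3))
```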
7

Chakroborti, Debasish, Banani Roy, and Sristy Sumana Nath. "Designing for Recommending Intermediate States in A Scientific Workflow Management System." Proceedings of the ACM on Human-Computer Interaction 5, EICS (2021): 1–29. http://dx.doi.org/10.1145/3457145.

Full text
Abstract:
To process a large amount of data sequentially and systematically, proper management of workflow components (i.e., modules, data, configurations, associations among ports and links) in a Scientific Workflow Management System (SWfMS) is inevitable. Managing data with provenance in a SWfMS to support reusability of workflows, modules, and data is not a simple task. Handling such components is even more burdensome for frequently assembled and executed complex workflows for investigating large datasets with different technologies (i.e., various learning algorithms or models). However, a great many
8

Wen, Yiping, Junjie Hou, Zhen Yuan, and Dong Zhou. "Heterogeneous Information Network-Based Scientific Workflow Recommendation for Complex Applications." Complexity 2020 (March 19, 2020): 1–16. http://dx.doi.org/10.1155/2020/4129063.

Full text
Abstract:
Scientific workflow is a valuable tool for various complicated large-scale data processing applications. In recent years, the increasingly growing number of scientific processes available necessitates the development of recommendation techniques to provide automatic support for modelling scientific workflows. In this paper, with the help of heterogeneous information network (HIN) and tags of scientific workflows, we organize scientific workflows as a HIN and propose a novel scientific workflow similarity computation method based on metapath. In addition, the density peak clustering (DPC) algor
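As a simplified illustration of metapath-based similarity between workflows in a heterogeneous information network, the sketch below computes the well-known PathSim measure along a workflow-tag-workflow metapath; the tag matrix and the choice of metapath are assumptions, not the paper's model.

```python
# Simplified sketch of metapath-based workflow similarity: the PathSim measure
# along a workflow-tag-workflow metapath. The tag matrix and the choice of
# metapath are assumptions; the paper defines its own similarity over a richer
# heterogeneous information network.
import numpy as np

workflows = ["wf1", "wf2", "wf3"]
tags = ["genomics", "alignment", "visualization"]  # columns of A
# A[i, j] = 1 if workflow i is annotated with tag j
A = np.array([
    [1, 1, 0],
    [1, 1, 1],
    [0, 0, 1],
])

# Commuting matrix: counts of workflow-tag-workflow path instances.
M = A @ A.T

def pathsim(i: int, j: int) -> float:
    return 2 * M[i, j] / (M[i, i] + M[j, j])

for i in range(len(workflows)):
    for j in range(i + 1, len(workflows)):
        print(workflows[i], workflows[j], round(float(pathsim(i, j)), 3))
```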
9

Satyam, Chauhan. "Enterprise Data Lakes for Financial Services." International Journal of Innovative Research and Creative Technology 5, no. 5 (2019): 1–17. https://doi.org/10.5281/zenodo.14384176.

Full text
Abstract:
Enterprise Data Lakes (EDLs) have emerged as transformative solutions for financial institutions seeking to address challenges associated with fragmented data systems, inefficient workflows, and stringent regulatory compliance. This paper presents an AWS-centric framework for building resilient and scalable EDLs tailored to the needs of financial services. Key aspects such as controlled change management, robust failover protocols, workflow orchestration, and secure data sharing are analyzed in detail. The study explores the design and implementation of data pipelines, leveraging AWS tools lik
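As a small, hedged illustration of one pipeline step in an S3-backed data lake of the kind the paper discusses, the sketch below lands a raw file under a zone- and date-partitioned key with boto3; the bucket name and zone layout are assumptions, and encryption, retries, and IAM setup are omitted.

```python
# Sketch of a single ingestion step for an S3-backed data lake: landing a raw
# file under a zone- and date-partitioned key. The bucket name and the
# "raw/<dataset>/year=/month=/day=" layout are assumptions for illustration;
# credentials come from the standard AWS environment or an IAM role, and
# encryption, retries, and validation are omitted.
from datetime import date

import boto3

s3 = boto3.client("s3")
BUCKET = "example-finance-data-lake"  # hypothetical bucket name

def land_raw_file(local_path: str, dataset: str) -> str:
    """Upload a local file into the raw zone and return the object key."""
    partition = date.today().strftime("year=%Y/month=%m/day=%d")
    filename = local_path.rsplit("/", 1)[-1]
    key = f"raw/{dataset}/{partition}/{filename}"
    s3.upload_file(local_path, BUCKET, key)
    return key

# Example: land_raw_file("trades_2019-05-01.csv", "trades")
```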
10

Wu, Keyao, and Shu Tang. "BIM-Assisted Workflow Enhancement for Architecture Preliminary Design." Buildings 12, no. 5 (2022): 601. http://dx.doi.org/10.3390/buildings12050601.

Full text
Abstract:
China’s urban housing demand has directly influenced urbanization development. To stabilize the level of urbanization, it is urgent to optimize the whole life-cycle efficiency of construction and the preliminary design as the first step is even more significant. Building Information Modeling (BIM) is widely used as information technology in the construction industry to promote the implementation and management of projects. However, the traditional preliminary design approach still occupies the mainstream market without forming a systematic BIM workflow, which causes inefficiency. To address th

Dissertations / Theses on the topic "Data Workflow Efficiency"

1

Pineda Morales, Luis Eduardo. "Efficient support for data-intensive scientific workflows on geo-distributed clouds." Thesis, Rennes, INSA, 2017. http://www.theses.fr/2017ISAR0012/document.

Full text
Abstract:
By 2020, the digital universe will reach 44 zettabytes, since it doubles every two years. Data come in the most diverse forms and originate from geographically dispersed sources. This data explosion creates an unprecedented need for data storage and processing, but also for data-processing software able to make the best use of these computing resources. Such large-scale applications often take the form of workflows, which help define the data dependencies between their various components. More and more…
2

Ikken, Sonia. "Efficient placement design and storage cost saving for big data workflow in cloud datacenters." Thesis, Evry, Institut national des télécommunications, 2017. http://www.theses.fr/2017TELE0020/document.

Full text
Abstract:
Workflows are typical big-data processing systems. These systems are deployed across geo-distributed sites to exploit existing cloud infrastructures and run large-scale experiments. The data generated by such experiments are considerable and are stored in several locations for reuse. Indeed, workflow systems are composed of collaborative tasks, which introduces new needs in terms of dependencies and the exchange of intermediate data for their processing. This raises new problems when selecting distributed data…
3

Ikken, Sonia. "Efficient placement design and storage cost saving for big data workflow in cloud datacenters." Electronic Thesis or Diss., Evry, Institut national des télécommunications, 2017. http://www.theses.fr/2017TELE0020.

Full text
Abstract:
Workflows are typical big-data processing systems. These systems are deployed across geo-distributed sites to exploit existing cloud infrastructures and run large-scale experiments. The data generated by such experiments are considerable and are stored in several locations for reuse. Indeed, workflow systems are composed of collaborative tasks, which introduces new needs in terms of dependencies and the exchange of intermediate data for their processing. This raises new problems when selecting distributed data…
4

Veit, Johannes [author], and Oliver [academic supervisor] Kohlbacher. "Efficient Workflows for Analyzing High-Performance Liquid Chromatography Mass Spectrometry-Based Proteomics Data / Johannes Veit ; supervisor: Oliver Kohlbacher." Tübingen: Universitätsbibliothek Tübingen, 2019. http://d-nb.info/1193489288/34.

Full text
5

Gosewisch, Astrid [author], and Guido [academic supervisor] Böning. "Patient-specific bone marrow dosimetry in Lu-177-based radionuclide therapy: investigation of efficient data acquisition protocols and a clinical Monte Carlo dosimetry workflow for 3D absorbed dose modelling / Astrid Gosewisch ; supervisor: Guido Böning." München: Universitätsbibliothek der Ludwig-Maximilians-Universität, 2020. http://d-nb.info/1211957306/34.

Full text
6

Barika, MSM. "Scheduling techniques for efficient execution of stream workflows in cloud environments." Thesis, 2020. https://eprints.utas.edu.au/34812/1/Barika_whole_thesis.pdf.

Full text
Abstract:
Advancements in Internet of Things (IoT) technology have led to the development of advanced applications and services that rely on data generated from enormous amounts of connected devices such as sensors, mobile devices and smart cars. These applications process and analyse such data as it arrives to unleash the potential of live analytics. Considering that our future world will be fully automated, current IoT applications and services are categorised as data-driven workflows, which integrate multiple analytical components. Examples of these workflow applications are smart farming, smart reta

Books on the topic "Data Workflow Efficiency"

1

Petchey, Owen L., Andrew P. Beckerman, Natalie Cooper, and Dylan Z. Childs. Insights from Data with R. Oxford University Press, 2021. http://dx.doi.org/10.1093/oso/9780198849810.001.0001.

Full text
Abstract:
Knowledge of how to get useful information from data is essential in the life and environmental sciences. This book provides learners with knowledge, experience, and confidence about how to efficiently and reliably discover useful information from data. The content is developed from first- and second-year undergraduate-level courses taught by the authors. It charts the journey from question, to raw data, to clean and tidy data, to visualizations that provide insights. This journey is presented as a repeatable workflow fit for use with many types of question, study, and data. Readers discover h
2

Westby, Emma Jane. Git for Teams: A User-Centered Approach to Creating Efficient Workflows in Git. O'Reilly Media, Incorporated, 2015.

Find full text
3

Martich, Daniel, and Jody Cervenak. Integration of information technology in the ICU. Oxford University Press, 2016. http://dx.doi.org/10.1093/med/9780199600830.003.0007.

Full text
Abstract:
As we look to the evolving health care industry with improved care quality, health outcomes, and cost parameters, the demands of the critical care environment require a transformation. Technology, process, and people are at the centre of this transformation. The power is in the knowledge that can be achieved and the process improvements that can be made through automation. Five major areas of technology evolution include workflow automation, information exchange, clinical decision support, and predictive modelling, remote monitoring, and data analytics. If designed properly, technology can res

Book chapters on the topic "Data Workflow Efficiency"

1

Senthil, Arjun K. S., Thiruvaazhi Uloli, and V. M. Sanjay. "Designing a Hyperledger Fabric-based workflow management system: A prototype solution to enhance organizational efficiency." In Applied Data Science and Smart Systems. CRC Press, 2024. http://dx.doi.org/10.1201/9781003471059-41.

Full text
2

Ramduny, Jivesh, Mélanie Garcia, and Clare Kelly. "Establishing a Reproducible and Sustainable Analysis Workflow." In Neuromethods. Springer US, 2024. https://doi.org/10.1007/978-1-0716-4260-3_4.

Full text
Abstract:
Getting started on any project is often the hardest thing—and when it comes to starting your career in research, just figuring out where and how to start can seem like an insurmountable challenge. This is particularly true at this moment—when there are so many programming languages, programs, and systems that are freely available to neuroimaging researchers, and even more guides, tutorials, and courses on how to use them. This chapter is intended to set you off on the right foot as you get stuck into the task of learning to work with large neuroimaging data. We will cover a number of p
3

Ioshchikhes, Borys, Daniel Piendl, Henrik Schmitz, Jasper Heiland, and Matthias Weigold. "Development of a Holistic Framework for Identifying Energy Efficiency Potentials of Production Machines." In Lecture Notes in Mechanical Engineering. Springer International Publishing, 2023. http://dx.doi.org/10.1007/978-3-031-28839-5_48.

Full text
Abstract:
A prerequisite to identify energy efficiency potentials and to improve energy efficiency is the measurement and analysis of the energy demand. However, in industrial practice, approaches to identify energy efficiency measures of production machines are associated with high costs for metering equipment and time consuming analysis requiring expertise. Against this background, this paper describes a comprehensive and cost-efficient framework from acquisition to analysis of energy data to serve as a starting point to increase energy efficiency in manufacturing. For this purpose, an energy
4

Hofer, Marvin, Sebastian Hellmann, Milan Dojchinovski, and Johannes Frey. "The New DBpedia Release Cycle: Increasing Agility and Efficiency in Knowledge Extraction Workflows." In Semantic Systems. In the Era of Knowledge Graphs. Springer International Publishing, 2020. http://dx.doi.org/10.1007/978-3-030-59833-4_1.

Full text
Abstract:
Abstract Since its inception in 2007, DBpedia has been constantly releasing open data in RDF, extracted from various Wikimedia projects using a complex software system called the DBpedia Information Extraction Framework (DIEF). For the past 12 years, the software received a plethora of extensions by the community, which positively affected the size and data quality. Due to the increase in size and complexity, the release process was facing huge delays (from 12 to 17 months cycle), thus impacting the agility of the development. In this paper, we describe the new DBpedia release cycle including
5

Copăcenaru, Olimpia, Adrian Stoica, Antonella Catucci, et al. "Copernicus Data and CAP Subsidies Control." In Big Data in Bioeconomy. Springer International Publishing, 2021. http://dx.doi.org/10.1007/978-3-030-71069-9_20.

Full text
Abstract:
This chapter integrates the results of three pilots developed within the framework of the Horizon 2020 DataBio project. It aims to provide a broad picture of how products based on Earth Observation techniques can support the European Union’s Common Agricultural Policy requirements, whose fulfillments are supervised by National and Local Paying Agencies operating in Romania, Italy and Greece. The concept involves the use of the same data sources, mainly multitemporal series of Copernicus Sentinel-2 imagery, but through three different Big Data processing chains, tailored to each paying
6

Odame, Prince Kwame, and Ebenezer Nana Kwaku Boateng. "Leveraging Spatial Technology for Agricultural Intensification to Address Hunger in Ghana." In Sustainable Development Goals Series. Springer International Publishing, 2022. http://dx.doi.org/10.1007/978-3-031-05182-1_3.

Full text
Abstract:
YouthMappers are using open geospatial tools in support of initiatives seeking to achieve SDG 2 Zero Hunger and SDG 1 No Poverty in Northern Ghana. Students and researchers designed survey questions and a field data collection workflow using simple but cost-effective technology to catalogue a database of farmers, properly demarcate farm sizes, and give farmers, in particular impoverished women, the opportunity to project farm yields and increase the efficiency of their output.
7

Filipp, Peter, Rene Dorsch, and Andreas Harth. "EVErPREP: Towards an Event Knowledge Graph Enhanced Workflow Model for Event Log Preparation." In Lecture Notes in Business Information Processing. Springer Nature Switzerland, 2025. https://doi.org/10.1007/978-3-031-82225-4_3.

Full text
Abstract:
Abstract Event data preparation is a critical yet time-consuming phase in process mining projects, often slowed down by complex relational data models and a lack of domain knowledge. This paper presents EVErPREP, a novel workflow model that leverages Event Knowledge Graphs to enhance event data preparation for event logs. EVErPREP uses Semantic Web technologies to improve the exploration, extraction, and processing of event data, ultimately improving the quality and interpretability of event data and event logs. The approach is evaluated through a case study at Munich Airport’s Baggage Handlin
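To give a flavour of representing event data as a knowledge graph, the sketch below encodes two events and their case relation as RDF triples with rdflib and queries them in timestamp order; the vocabulary is ad hoc for illustration and is not the EVErPREP model.

```python
# Sketch of representing event records as an RDF event graph with rdflib and
# querying them in timestamp order (a typical log-preparation step). The
# ex: vocabulary is ad hoc for illustration; it is not the EVErPREP model or
# an OCEL/XES schema.
from rdflib import Graph, Literal, Namespace, RDF, URIRef
from rdflib.namespace import XSD

EX = Namespace("http://example.org/events#")
g = Graph()
g.bind("ex", EX)

def add_event(graph: Graph, event_id: str, activity: str, case_id: str, ts: str) -> URIRef:
    ev = EX[event_id]
    graph.add((ev, RDF.type, EX.Event))
    graph.add((ev, EX.activity, Literal(activity)))
    graph.add((ev, EX.belongsToCase, EX[case_id]))
    graph.add((ev, EX.timestamp, Literal(ts, datatype=XSD.dateTime)))
    return ev

add_event(g, "e1", "scan_bag", "case42", "2024-05-01T08:00:00")
add_event(g, "e2", "load_bag", "case42", "2024-05-01T08:07:30")

query = """
    SELECT ?ev ?activity ?ts WHERE {
        ?ev ex:belongsToCase ex:case42 ;
            ex:activity ?activity ;
            ex:timestamp ?ts .
    } ORDER BY ?ts
"""
for row in g.query(query, initNs={"ex": EX}):
    print(row.ev, row.activity, row.ts)
```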
8

Cruz, Yarens J., Fernando Castaño, Rodolfo E. Haber, et al. "Self-Reconfiguration for Smart Manufacturing Based on Artificial Intelligence: A Review and Case Study." In Artificial Intelligence in Manufacturing. Springer Nature Switzerland, 2024. http://dx.doi.org/10.1007/978-3-031-46452-2_8.

Full text
Abstract:
Self-reconfiguration in manufacturing systems refers to the ability to autonomously execute changes in the production process to deal with variations in demand and production requirements while ensuring a high responsiveness level. Some advantages of these systems are their improved efficiency, flexibility, adaptability, and cost-effectiveness. Different approaches can be used for designing self-reconfigurable manufacturing systems, including computer simulation, data-driven methods, and artificial intelligence-based methods. To assess an artificial intelligence-based solution focused
9

Lehmann, Henry, and Bernhard Jung. "Virtual Prototyping of Metal Melt Filters: A HPC-Based Workflow for Query-Driven Visualization." In Multifunctional Ceramic Filter Systems for Metal Melt Filtration. Springer International Publishing, 2024. http://dx.doi.org/10.1007/978-3-031-40930-1_18.

Full text
Abstract:
The advent of additive manufacturing has vastly increased the design space of filters for metal melt filtration. In order to assess the large number of possible filter designs before they are actually manufactured, a virtual prototyping workflow based on HPC simulations is proposed. A major challenge is the large data volumes produced even by single CFD simulations of metal melt flow and even more so by virtual prototyping settings, where a large number of filter designs must be evaluated. The LITE-QA framework addresses the large data problem by providing data management methods support
10

Hachinger, Stephan, Martin Golasowski, Jan Martinovič, et al. "Leveraging High-Performance Computing and Cloud Computing with Unified Big-Data Workflows: The LEXIS Project." In Technologies and Applications for Big Data Value. Springer International Publishing, 2021. http://dx.doi.org/10.1007/978-3-030-78307-5_8.

Full text
Abstract:
Traditional usage models of Supercomputing centres have been extended by High-Throughput Computing (HTC), High-Performance Data Analytics (HPDA) and Cloud Computing. The complexity of current compute platforms calls for solutions to simplify usage and conveniently orchestrate computing tasks. These also enable non-expert users to efficiently execute Big Data workflows. In this context, the LEXIS project (‘Large-scale EXecution for Industry and Society’, H2020 GA 825532, https://lexis-project.eu) sets up an orchestration platform for compute- and data-intensive workflows. Its main objec

Conference papers on the topic "Data Workflow Efficiency"

1

Girisagar, Vishal, Tram Truong-Huu, and Mohan Gurusamy. "Mapping workflow resource requests for bandwidth efficiency in data centers." In ICDCN '16: 17th International Conference on Distributed Computing and Networking. ACM, 2016. http://dx.doi.org/10.1145/2833312.2833448.

Full text
2

Azari, V., D. Gonzalez, A. Edirisooriya, et al. "Reservoir Fluid Data Workflow Management to Enhance the Operational Efficiency." In Fifth EAGE Digitalization Conference & Exhibition. European Association of Geoscientists & Engineers, 2025. https://doi.org/10.3997/2214-4609.202539071.

Full text
3

Kowalchuk, P., A. Grotte, S. Brandsberg-Dahl, V. Sabharwal, and U. V. Jensen. "Large Language Model-Based Workflow for Optimizing Offset Well Data Analysis and Generating Well Design Risk Profiles." In Offshore Technology Conference. OTC, 2025. https://doi.org/10.4043/35607-ms.

Full text
Abstract:
Abstract Today's workflows for designing and approving new wells depend on modern well engineering practices that integrate input from multiple disciplines like subsurface, engineering, and drilling. Effective execution and planning within these workflows involve complex data analysis including data from historical events from nearby- and offset-wells. Accessing and preparing all this data requires extensive data reviews, that when relying on the historically mainly manual processes and tools, can be both time-consuming and resource intensive. To achieve a step change in efficiency there is a
4

Chen, Wei, Zhengxin Zhang, Yuelin Shen, et al. "Method For Real-Time Bit Efficiency and Damage Monitoring; Leveraging Offset Data and Domain Knowledge." In SPE/IADC International Drilling Conference and Exhibition. SPE, 2025. https://doi.org/10.2118/223696-ms.

Full text
Abstract:
Abstract The ability to predict the useful life remaining for a drill bit or estimate the dull grading in real time while drilling with high accuracy is valuable for planning and real-time decisions within well construction. Domain experts and software engineers have developed a workflow that utilizes the company's global offset well database and bit knowledge to feed an intelligent decision-making system that can enhance the drilling process and prevent excessive bit damage by calculating the efficiency of the bit and predicting if the dull grading at the end of the run will be beyond a certa
5

Temer, Elias, Hakim Arabi, Sameer Bisht, et al. "Extending Automated Drilling Capabilities by Enabling Automated Data Aggregation and Drilling Advisory Workflows - Case Study." In International Petroleum Technology Conference. IPTC, 2025. https://doi.org/10.2523/iptc-25093-ms.

Full text
Abstract:
Abstract Automation is revolutionizing efficiency in various sectors, notably in oil and gas, by streamlining tasks to enhance speed, safety, and labor reduction. On-target well delivery solution systems integrate automation, advisory systems, and data aggregation, leveraging AI to optimize drilling operations while ensuring compliance with well-construction plans. In drilling edge operations, data aggregation systems collect and manage information from various sources using standardized protocols, addressing challenges such as hardware duplication and network complexity. In this paper we prop
6

Fajardo, Felix J., Felix Jahn, and Kjetil Austbø. "Integrated Telemetry, Flow Simulator, and Data Visualization Software Streamline Cleanout Operations with Coiled Tubing." In SPE/ICoTA Well Intervention Conference and Exhibition. SPE, 2025. https://doi.org/10.2118/224079-ms.

Full text
Abstract:
Abstract Coiled Tubing (CT) cleanout operations in subsea wells with low reservoir pressure present unique challenges, particularly in monitoring return flow rates when instrumentation is not available. This paper introduces a novel workflow that integrates telemetry, dynamic multiphase flow simulation, and data visualization to optimize solid transport, reduce risks, and enhance operational efficiency. Designed for vessel-based light well interventions, the workflow offers a more agile and responsive alternative to traditional fixed-platform monitoring methods, addressing the complexities of
7

Rekik, Karim, Zengpeng Zhou, Sohaib Ouzineb, Olfa Zened, and Myriam Amour. "Accelerating Wellbore Domain Interpretations Using Generative AI and Knowledge Graphs." In ADIPEC. SPE, 2024. http://dx.doi.org/10.2118/222164-ms.

Full text
Abstract:
Abstract The petroleum industry relies heavily on accurate wellbore domain interpretation for effective resource management and extraction. Traditionally, this process involves numerous complex and time-consuming steps, such as data normalization, unit harmonization, and depth matching, often requiring extensive expertise. This paper presents an AI-powered solution designed to automate the creation of wellbore interpretation workflows, significantly enhancing efficiency and accuracy. Our solution integrates generative AI, large language models (LLM), and graph-based methods to streamline the w
8

Gogoi, U., M. Gupta, B. Bhatt, et al. "Automated Well Test Validation and Model Calibration for Enhanced Data Integrity and Operational Oversight." In SPE Conference at Oman Petroleum & Energy Show. SPE, 2025. https://doi.org/10.2118/225021-ms.

Full text
Abstract:
Abstract A streamlined process for automated well test validation and calibration is made possible through an interactive user interface, that standardizes vendor-specific well test reports and eliminates the need for manual template customization. Integration with database systems and a step-by-step comprehensive analysis of the well test validity and calibration, including notifications for continuous monitoring, enhance data quality, enabling field engineer oversight of well performance. This automated approach saves considerable time and effort compared to manual methods. Data sourced from
9

Williams-Kovacs, J. D., D. Deryushkin, and C. R. Clarkson. "New Hybrid Workflow for Field Development Planning and Execution: Early Time Fracs vs Long Term Production." In SPE Hydraulic Fracturing Technology Conference and Exhibition. SPE, 2025. https://doi.org/10.2118/223544-ms.

Full text
Abstract:
Abstract Recently, a hybrid workflow for cost-effective and efficient field development planning and execution was introduced to accelerate learnings from post-frac data and to inform development decisions, without waiting for long-term production. This workflow combines typically gathered field data (i.e. geomechanics, flowback and short-term production) with existing integrated workflows to significantly reduce uncertainty and execution time associated with today’s asset development workflows. Although powerful, this workflow lacks ties with long-term production data and does not account for
10

Bachi, Hana, Jianfa Wu, Chuxi Liu, et al. "An Efficient Hydraulic Fracture Geometry Calibration Workflow Using Microseismic Data." In SPE Oklahoma City Oil and Gas Symposium. SPE, 2023. http://dx.doi.org/10.2118/213085-ms.

Full text
Abstract:
Abstract Microseismic technology has proven its efficiency to monitor hydraulic fracturing effectiveness. The objective of this study is to develop a novel method to calibrate and generate the hydraulic fracture cluster-based model of a multi-stage horizontal shale well using the microseismic data. We use microcosmic numerical model known as Microseismic EDFM software feature (MSE-Frac) with the embedded discrete fracture model to simulate the hydraulic and natural fractures and the discrete fracture network. The MSE-Frac can handle the grouping of the clustered microcosmic events around the w

Reports on the topic "Data Workflow Efficiency"

1

Salter, R., Quyen Dong, Cody Coleman, et al. Data Lake Ecosystem Workflow. Engineer Research and Development Center (U.S.), 2021. http://dx.doi.org/10.21079/11681/40203.

Full text
Abstract:
The Engineer Research and Development Center, Information Technology Laboratory’s (ERDC-ITL’s) Big Data Analytics team specializes in the analysis of large-scale datasets with capabilities across four research areas that require vast amounts of data to inform and drive analysis: large-scale data governance, deep learning and machine learning, natural language processing, and automated data labeling. Unfortunately, data transfer between government organizations is a complex and time-consuming process requiring coordination of multiple parties across multiple offices and organizations. Past succ
2

Humpage, Sarah D. Benefits and Costs of Electronic Medical Records: The Experience of Mexico's Social Security Institute. Inter-American Development Bank, 2010. http://dx.doi.org/10.18235/0008829.

Full text
Abstract:
Electronic medical record (EMR) systems are increasingly used in developing countries to improve quality of care while increasing efficiency. There is little systematic evidence, however, regarding EMRs' benefits and costs. This case study documents the implementation and use of an EMR system at the Mexican Social Security Institute (IMSS). Three EMR systems are now in operation for primary care, outpatient and inpatient hospital care. The evidence suggests that the primary care system has improved efficiency of care delivery and human resources management, and may have decreased incidence of
3

Raja, Rameez Ali, Vidushi Toshniwal, and Rodrigo Salgado. GIS-Based Geotechnical Database for Collaborative GIS. Purdue University, 2023. http://dx.doi.org/10.5703/1288284317637.

Full text
Abstract:
INDOT spends at least 8 million dollars annually on geotechnical site investigations, not including the amounts spent by contractors. The laborious and costly job of data collection in geotechnical practice requires the efficient storing and organizing of this valuable data to develop correlations and trends in spatially varying geotechnical data. INDOT currently uses gINT software for managing geotechnical data and ArcGIS for storing boring logs and geotechnical reports. The INDOT geotechnical office is pursuing means to improve the efficiency of their operations by developing a GIS-based geo
4

Meloncelli, Daniel. Getting started with R and RStudio (Free Seminar). Instats Inc., 2025. https://doi.org/10.61700/5iuwjpqvm5oyo1560.

Full text
Abstract:
This seminar introduces R and RStudio, providing a hands-on guide to setting up and managing research projects efficiently. Participants will explore the RStudio interface, install and manage packages, create and organise an R project, and configure a working directory for reproducible workflows. The seminar covers importing, exploring, and preparing data, offering a hands-on experience with real-world datasets to provide the skills needed to execute efficient and reproducible data analysis projects.
5

Steel, Piers, and Hadi Fariborzi. Beyond ChatGPT: Using AI to Enhance Research Workflows. Instats Inc., 2025. https://doi.org/10.61700/57x0xi7z30bh11498.

Full text
Abstract:
This one-day workshop equips PhD students and researchers with practical skills to integrate AI tools into their academic work effectively and ethically. Participants will gain hands-on experience with large language models, learning to enhance research quality and efficiency while navigating ethical considerations and leveraging AI for literature reviews, data analysis, and more.
6

Verburg, Peter H., Žiga Malek, Sean P. Goodwin, and Cecilia Zagaria. The Integrated Economic-Environmental Modeling (IEEM) Platform: IEEM Platform Technical Guides: User Guide for the IEEM-enhanced Land Use Land Cover Change Model Dyna-CLUE. Inter-American Development Bank, 2021. http://dx.doi.org/10.18235/0003625.

Full text
Abstract:
The Conversion of Land Use and its Effects modeling framework (CLUE) was developed to simulate land use change using empirically quantified relations between land use and its driving factors in combination with dynamic modeling of competition between land use types. Being one of the most widely used spatial land use models, CLUE has been applied all over the world on different scales. In this document, we demonstrate how the model can be used to develop a multi-regional application. This means, that instead of developing numerous individual models, the user only prepares one CLUE model applica
7

Ward, Logan Timothy, and Robert Errol Hackenberg. Efficient Workflows for Curation of Heterogeneous Data Supporting Modeling of U-Nb Alloy Aging. Office of Scientific and Technical Information (OSTI), 2016. http://dx.doi.org/10.2172/1324576.

Full text
8

Fuentes, Anthony, Michelle Michaels, and Sally Shoop. Methodology for the analysis of geospatial and vehicle datasets in the R language. Cold Regions Research and Engineering Laboratory (U.S.), 2021. http://dx.doi.org/10.21079/11681/42422.

Full text
Abstract:
The challenge of autonomous off-road operations necessitates a robust understanding of the relationships between remotely sensed terrain data and vehicle performance. The implementation of statistical analyses on large geospatial datasets often requires the transition between multiple software packages that may not be open-source. The lack of a single, modular, and open-source analysis environment can reduce the speed and reliability of an analysis due to an increased number of processing steps. Here we present the capabilities of a workflow, developed in R, to perform a series of spatial and
9

Barter, Rebecca. Python for R Users. Instats Inc., 2024. http://dx.doi.org/10.61700/g1u72h8m3dbuw1676.

Full text
Abstract:
This workshop is designed to help researchers fluent in R, particularly with the tidyverse, transition easily to Python, enhancing their data analysis skills across a range of fields. Participants will gain a robust understanding of Python's libraries and data structures, enabling them to integrate Python into their research workflows efficiently and expand their analytical capabilities, including with Python's substantial toolkits for machine learning and deep learning such as Scikit-learn and Keras.
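Since the description mentions Scikit-learn among Python's toolkits, here is a minimal pipeline example of the kind such a transition typically starts from; the dataset choice (iris) is mine, purely so the snippet runs self-contained.

```python
# Minimal scikit-learn pipeline, roughly the Python analogue of an R
# tidymodels workflow, of the kind the workshop description mentions. The
# dataset choice (iris) is mine so the snippet runs self-contained.
from sklearn.datasets import load_iris
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler

X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0
)

model = Pipeline([
    ("scale", StandardScaler()),                # centre and scale features
    ("clf", LogisticRegression(max_iter=500)),  # multinomial classifier
])
model.fit(X_train, y_train)
print("held-out accuracy:", round(model.score(X_test, y_test), 3))
```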
10

Downard, Alicia, Stephen Semmens, and Bryant Robbins. Automated characterization of ridge-swale patterns along the Mississippi River. Engineer Research and Development Center (U.S.), 2021. http://dx.doi.org/10.21079/11681/40439.

Full text
Abstract:
The orientation of constructed levee embankments relative to alluvial swales is a useful measure for identifying regions susceptible to backward erosion piping (BEP). This research was conducted to create an automated, efficient process to classify patterns and orientations of swales within the Lower Mississippi Valley (LMV) to support levee risk assessments. Two machine learning algorithms are used to train the classification models: a convolutional neural network and a U-net. The resulting workflow can identify linear topographic features but is unable to reliably differentiate swales from o