Academic literature on the topic "Data-Independent Acquisition (DIA)"

Create accurate citations in APA, MLA, Chicago, Harvard, and other styles

Choose a source type:

Consult the topical lists of articles, books, theses, conference papers, and other scholarly sources on the topic "Data-Independent Acquisition (DIA)".

Next to every source in the list of references there is an "Add to bibliography" button. Press it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Vancouver, Chicago, etc.

You can also download the full text of the scholarly publication in PDF format and read its abstract online whenever it is available in the metadata.

Journal articles on the topic "Data-Independent Acquisition (DIA)"

1

Hu, Alex, William S. Noble, and Alejandro Wolf-Yadlin. "Technical advances in proteomics: new developments in data-independent acquisition". F1000Research 5 (March 31, 2016): 419. http://dx.doi.org/10.12688/f1000research.7042.1.

Full text
Abstract
The ultimate aim of proteomics is to fully identify and quantify the entire complement of proteins and post-translational modifications in biological samples of interest. For the last 15 years, liquid chromatography-tandem mass spectrometry (LC-MS/MS) in data-dependent acquisition (DDA) mode has been the standard for proteomics when sampling breadth and discovery were the main objectives; multiple reaction monitoring (MRM) LC-MS/MS has been the standard for targeted proteomics when precise quantification, reproducibility, and validation were the main objectives. Recently, improvements in mass spectrometer design and bioinformatics algorithms have resulted in the rediscovery and development of another sampling method: data-independent acquisition (DIA). DIA comprehensively and repeatedly samples every peptide in a protein digest, producing a complex set of mass spectra that is difficult to interpret without external spectral libraries. Currently, DIA approaches the identification breadth of DDA while achieving the reproducible quantification characteristic of MRM or its newest version, parallel reaction monitoring (PRM). In comparative de novo identification and quantification studies in human cell lysates, DIA identified up to 89% of the proteins detected in a comparable DDA experiment while providing reproducible quantification of over 85% of them. DIA analysis aided by spectral libraries derived from prior DIA experiments or auxiliary DDA data produces identification and quantification as reproducible and precise as that achieved by MRM/PRM, except on low‑abundance peptides that are obscured by stronger signals. DIA is still a work in progress toward the goal of sensitive, reproducible, and precise quantification without external spectral libraries. New software tools applied to DIA analysis have to deal with deconvolution of complex spectra as well as proper filtering of false positives and false negatives. 
However, the future outlook is positive, and various researchers are working on novel bioinformatics techniques to address these issues and increase the reproducibility, fidelity, and identification breadth of DIA.
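To make the sampling contrast concrete, here is a minimal Python sketch, with purely illustrative m/z bounds and window width (not taken from the article), of the fixed-width isolation-window cycle that DIA repeatedly steps through:

```python
# Hypothetical illustration: one DIA cycle steps through fixed-width
# precursor isolation windows covering the whole m/z range, whereas DDA
# picks precursors by intensity. Bounds and widths here are invented.

def dia_windows(mz_start, mz_end, width, overlap=0.0):
    """Return the (low, high) m/z bounds for one DIA cycle."""
    windows = []
    low = mz_start
    while low < mz_end:
        high = min(low + width, mz_end)
        windows.append((low, high))
        if high >= mz_end:
            break
        low = high - overlap  # adjacent windows may overlap slightly
    return windows

# A 400-1000 m/z range covered by 25 m/z windows with 1 m/z overlap:
cycle = dia_windows(400.0, 1000.0, 25.0, overlap=1.0)
print(len(cycle), cycle[0], cycle[-1])
```

Every cycle fragments everything co-isolated in each window, which is what produces the highly multiplexed spectra the abstract describes.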
APA, Harvard, Vancouver, ISO, and other styles
2

Pino, Lindsay K., Seth C. Just, Michael J. MacCoss, and Brian C. Searle. "Acquiring and Analyzing Data Independent Acquisition Proteomics Experiments without Spectrum Libraries". Molecular & Cellular Proteomics 19, no. 7 (April 20, 2020): 1088–103. http://dx.doi.org/10.1074/mcp.p119.001913.

Full text
Abstract
Data independent acquisition (DIA) is an attractive alternative to standard shotgun proteomics methods for quantitative experiments. However, most DIA methods require collecting exhaustive, sample-specific spectrum libraries with data dependent acquisition (DDA) to detect and quantify peptides. In addition to working with non-human samples, studies of splice junctions, sequence variants, or simply working with small sample yields can make developing DDA-based spectrum libraries impractical. Here we illustrate how to acquire, queue, and validate DIA data without spectrum libraries, and provide a workflow to efficiently generate DIA-only chromatogram libraries using gas-phase fractionation (GPF). We present best-practice methods for collecting DIA data using Orbitrap-based instruments and develop an understanding for why DIA using an Orbitrap mass spectrometer should be approached differently than when using time-of-flight instruments. Finally, we discuss several methods for analyzing DIA data without libraries.
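The gas-phase fractionation (GPF) idea, narrow windows spread across several injections, can be sketched schematically. The range, injection count, and window width below are invented for illustration and are not the authors' published scheme:

```python
# Hypothetical GPF schedule for building a DIA-only chromatogram
# library: the precursor range is split across several injections,
# each covered with much narrower windows than a single-shot run.

def gpf_schedule(mz_start, mz_end, n_injections, window_width):
    """Map each injection number to its list of narrow (low, high) windows."""
    span = (mz_end - mz_start) / n_injections
    schedule = {}
    for i in range(n_injections):
        lo = mz_start + i * span
        hi = lo + span
        windows, w = [], lo
        while w < hi:
            windows.append((w, min(w + window_width, hi)))
            w += window_width
        schedule[i + 1] = windows
    return schedule

# Six injections over 400-1000 m/z with 4 m/z windows (100 m/z each):
plan = gpf_schedule(400.0, 1000.0, 6, 4.0)
print(len(plan), plan[1][0], len(plan[1]))
```

The narrow per-injection windows reduce spectral multiplexing, which is why GPF runs can stand in for a DDA-derived library.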
APA, Harvard, Vancouver, ISO, and other styles
3

Shah, Syed Muhammad Zaki, Arslan Ali, Muhammad Noman Khan, Adeeba Khadim, Mufarreh Asmari, Jalal Uddin, and Syed Ghulam Musharraf. "Sensitive Detection of Pharmaceutical Drugs and Metabolites in Serum Using Data-Independent Acquisition Mass Spectrometry and Open-Access Data Acquisition Tools". Pharmaceuticals 15, no. 7 (July 21, 2022): 901. http://dx.doi.org/10.3390/ph15070901.

Full text
Abstract
Data-independent acquisition (DIA) based strategies have been explored in recent years for improving quantitative analysis of metabolites. However, the data analysis is challenging for DIA methods as the resulting spectra are highly multiplexed. Thus, the DIA mode requires advanced software analysis to facilitate the data deconvolution process. We proposed a pipeline for quantitative profiling of pharmaceutical drugs and serum metabolites in DIA mode, using open-access software, after comparing the results obtained from full-scan, data-dependent acquisition (DDA), and DIA modes. Pharmaceutical drugs (10) were pooled in healthy human serum and analysed by LC-ESI-QTOF-MS. MS1 full-scan and data-dependent (MS2) results were used for identification with the MS-DIAL software, while deconvolution of MS1/MS2 spectra in DIA mode was achieved using the Skyline software. The results of the acquisition methods for quantitative analysis validated the remarkable analytical performance of the constructed workflow, proving it to be a sensitive and reproducible pipeline for complex biological fluids.
APA, Harvard, Vancouver, ISO, and other styles
4

Barbier Saint Hilaire, Pierre, Kathleen Rousseau, Alexandre Seyer, Sylvain Dechaumet, Annelaure Damont, Christophe Junot, and François Fenaille. "Comparative Evaluation of Data Dependent and Data Independent Acquisition Workflows Implemented on an Orbitrap Fusion for Untargeted Metabolomics". Metabolites 10, no. 4 (April 18, 2020): 158. http://dx.doi.org/10.3390/metabo10040158.

Full text
Abstract
Constant improvements to the Orbitrap mass analyzer, such as acquisition speed, resolution, dynamic range and sensitivity have strengthened its value for the large-scale identification and quantification of metabolites in complex biological matrices. Here, we report the development and optimization of Data Dependent Acquisition (DDA) and Sequential Window Acquisition of all THeoretical fragment ions (SWATH-type) Data Independent Acquisition (DIA) workflows on a high-field Orbitrap FusionTM TribridTM instrument for the robust identification and quantification of metabolites in human plasma. By using a set of 47 exogenous and 72 endogenous molecules, we compared the efficiency and complementarity of both approaches. We exploited the versatility of this mass spectrometer to collect meaningful MS/MS spectra at both high- and low-mass resolution and various low-energy collision-induced dissociation conditions under optimized DDA conditions. We also observed that complex and composite DIA-MS/MS spectra can be efficiently exploited to identify metabolites in plasma thanks to a reference tandem spectral library made from authentic standards while also providing a valuable data resource for further identification of unknown metabolites. Finally, we found that adding multi-event MS/MS acquisition did not degrade the ability to use survey MS scans from DDA and DIA workflows for the reliable absolute quantification of metabolites down to 0.05 ng/mL in human plasma.
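The absolute quantification mentioned above rests on external calibration. As a generic, hypothetical sketch (not the authors' actual workflow), back-calculating a concentration from a least-squares calibration line looks like:

```python
# Hypothetical calibration curve for absolute quantification: fit peak
# area vs. spiked concentration, then back-calculate an unknown.
# Concentrations and areas below are invented.

def fit_line(xs, ys):
    """Ordinary least-squares slope and intercept."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    slope /= sum((x - mx) ** 2 for x in xs)
    return slope, my - slope * mx

conc = [0.05, 0.1, 0.5, 1.0, 5.0]      # ng/mL spiked standards
area = [1.0, 2.0, 10.0, 20.0, 100.0]   # integrated peak areas (a.u.)
slope, intercept = fit_line(conc, area)
unknown = (37.0 - intercept) / slope   # back-calculated concentration
print(round(unknown, 2))
```

The lowest standard here mirrors the 0.05 ng/mL limit quoted in the abstract; real curves are validated with weighting, replicates, and accuracy criteria.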
APA, Harvard, Vancouver, ISO, and other styles
5

Lu, Yang Young, Jeff Bilmes, Ricard A. Rodriguez-Mias, Judit Villén, and William Stafford Noble. "DIAmeter: matching peptides to data-independent acquisition mass spectrometry data". Bioinformatics 37, Supplement_1 (July 1, 2021): i434–i442. http://dx.doi.org/10.1093/bioinformatics/btab284.

Full text
Abstract
Motivation: Tandem mass spectrometry data acquired using data independent acquisition (DIA) is challenging to interpret because the data exhibits complex structure along both the mass-to-charge (m/z) and time axes. The most common approach to analyzing this type of data makes use of a library of previously observed DIA data patterns (a ‘spectral library’), but this approach is expensive because the libraries do not typically generalize well across laboratories.
Results: Here, we propose DIAmeter, a search engine that detects peptides in DIA data using only a peptide sequence database. Although some existing library-free DIA analysis methods (i) support data generated using both wide and narrow isolation windows, (ii) detect peptides containing post-translational modifications, (iii) analyze data from a variety of instrument platforms and (iv) are capable of detecting peptides even in the absence of detectable signal in the survey (MS1) scan, DIAmeter is the only method that offers all four capabilities in a single tool.
Availability and implementation: The open source, Apache licensed source code is available as part of the Crux mass spectrometry analysis toolkit (http://crux.ms).
Supplementary information: Supplementary data are available at Bioinformatics online.
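For contrast with DIAmeter's library-free approach, the spectral-library matching it sidesteps is, at its core, a similarity score between a reference spectrum and an acquired (multiplexed) DIA spectrum. A toy version, with an invented bin width and made-up peak lists, might look like:

```python
# Toy library-vs-DIA spectrum comparison: bin peaks by m/z, then take a
# normalized dot product. Bin width and peak lists are invented; real
# tools also handle retention time, charge, and interference.
import math
from collections import defaultdict

def binned(spectrum, bin_width=0.02):
    """Sum peak intensities into fixed-width m/z bins."""
    bins = defaultdict(float)
    for mz, intensity in spectrum:
        bins[round(mz / bin_width)] += intensity
    return bins

def cosine_score(ref, acquired, bin_width=0.02):
    a, b = binned(ref, bin_width), binned(acquired, bin_width)
    dot = sum(a[k] * b[k] for k in a if k in b)
    norm = math.sqrt(sum(v * v for v in a.values()))
    norm *= math.sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

library_spectrum = [(300.16, 1.0), (401.21, 0.6), (514.30, 0.3)]
dia_spectrum = [(300.16, 0.9), (401.21, 0.5), (450.00, 0.8), (514.30, 0.2)]
print(round(cosine_score(library_spectrum, dia_spectrum), 3))
```

The 450.00 peak plays the role of an interfering co-isolated fragment, which lowers the score; this interference is exactly what makes multiplexed DIA spectra hard to match directly.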
APA, Harvard, Vancouver, ISO, and other styles
6

Nijssen, Rosalie, Marco H. Blokland, Robin S. Wegh, Erik de Lange, Stefan P. J. van Leeuwen, Bjorn J. A. Berendsen, and Milou G. M. van de Schans. "Comparison of Compound Identification Tools Using Data Dependent and Data Independent High-Resolution Mass Spectrometry Spectra". Metabolites 13, no. 7 (June 21, 2023): 777. http://dx.doi.org/10.3390/metabo13070777.

Full text
Abstract
Liquid chromatography combined with high-resolution mass spectrometry (LC-HRMS) is a frequently applied technique for suspect screening (SS) and non-target screening (NTS) in metabolomics and environmental toxicology. However, correctly identifying compounds based on SS or NTS approaches remains challenging, especially when using data-independent acquisition (DIA). This study assessed the performance of four HRMS-spectra identification tools to annotate in-house generated data-dependent acquisition (DDA) and DIA HRMS spectra of 32 pesticides, veterinary drugs, and their metabolites. The identification tools were challenged with a diversity of compounds, including isomeric compounds. The identification power was evaluated in solvent standards and spiked feed extract. In DDA spectra, the mass spectral library mzCloud provided the highest success rate, with 84% and 88% of the compounds correctly identified in the top three in solvent standard and spiked feed extract, respectively. The in silico tools MSfinder, CFM-ID, and Chemdistiller also performed well in DDA data, with identification success rates above 75% for both solvent standard and spiked feed extract. MSfinder provided the highest identification success rates using DIA spectra with 72% and 75% (solvent standard and spiked feed extract, respectively), and CFM-ID performed almost similarly in solvent standard and slightly less in spiked feed extract (72% and 63%). The identification success rates for Chemdistiller (66% and 38%) and mzCloud (66% and 31%) were lower, especially in spiked feed extract. The difference in success rates between DDA and DIA is most likely caused by the higher complexity of the DIA spectra, making direct spectral matching more complex. However, this study demonstrates that DIA spectra can be used for compound annotation in certain software tools, although the success rate is lower than for DDA spectra.
APA, Harvard, Vancouver, ISO, and other styles
7

Wang, Jian, Monika Tucholska, James D. R. Knight, Jean-Philippe Lambert, Stephen Tate, Brett Larsen, Anne-Claude Gingras, and Nuno Bandeira. "MSPLIT-DIA: sensitive peptide identification for data-independent acquisition". Nature Methods 12, no. 12 (November 9, 2015): 1106–8. http://dx.doi.org/10.1038/nmeth.3655.

Full text
APA, Harvard, Vancouver, ISO, and other styles
8

Koopmans, Frank, Jenny T. C. Ho, August B. Smit, and Ka Wan Li. "Comparative Analyses of Data Independent Acquisition Mass Spectrometric Approaches: DIA, WiSIM-DIA, and Untargeted DIA". PROTEOMICS 18, no. 1 (January 2018): 1700304. http://dx.doi.org/10.1002/pmic.201700304.

Full text
APA, Harvard, Vancouver, ISO, and other styles
9

Fierro-Monti, Ivo, Klemens Fröhlich, Christian Schori, and Alexander Schmidt. "Assessment of Data-Independent Acquisition Mass Spectrometry (DIA-MS) for the Identification of Single Amino Acid Variants". Proteomes 12, no. 4 (November 6, 2024): 33. http://dx.doi.org/10.3390/proteomes12040033.

Full text
Abstract
Proteogenomics integrates genomic and proteomic data to elucidate cellular processes by identifying variant peptides, including single amino acid variants (SAAVs). In this study, we assessed the capability of data-independent acquisition mass spectrometry (DIA-MS) to identify SAAV peptides in HeLa cells using various search engine pipelines. We developed a customised sequence database (DB) incorporating SAAV sequences from the HeLa genome and conducted searches using DIA-NN, Spectronaut, and Fragpipe-MSFragger. Our evaluation focused on identifying true positive SAAV peptides and false positives through entrapment DBs. This study revealed that DIA-MS provides reproducible and comprehensive coverage of the proteome, identifying a substantial proportion of SAAV peptides. Notably, the DIA-MS searches maintained consistent identification of SAAV peptides despite varying sizes of the entrapment DB. A comparative analysis showed that Fragpipe-MSFragger (FP-DIA) demonstrated the most conservative and effective performance, exhibiting the lowest false discovery match ratio (FDMR). Additionally, integrating DIA and data-dependent acquisition (DDA) MS data search outputs enhanced SAAV peptide identification, with a lower false discovery rate (FDR) observed in DDA searches. The validation using stable isotope dilution and parallel reaction monitoring (SID-PRM) confirmed the SAAV peptides identified by DIA-MS and DDA-MS searches, highlighting the reliability of our approach. Our findings underscore the effectiveness of DIA-MS in proteogenomic workflows for identifying SAAV peptides, offering insights into optimising search engine pipelines and DB construction for accurate proteomics analysis. These methodologies advance the understanding of proteome variability, contributing to cancer research and the identification of novel proteoform therapeutic targets.
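The entrapment-database check described above can be illustrated with a toy calculation; the identifiers and counts below are made up, and real pipelines compute this over thousands of peptides at a controlled FDR:

```python
# Toy false discovery match ratio (FDMR): pad the search database with
# foreign "entrapment" sequences, then see what fraction of accepted
# hits land in the entrapment portion. All values here are invented.

def false_discovery_match_ratio(accepted_hits, entrapment_ids):
    """Fraction of accepted identifications that match entrapment entries."""
    if not accepted_hits:
        return 0.0
    trapped = sum(1 for h in accepted_hits if h in entrapment_ids)
    return trapped / len(accepted_hits)

hits = ["P01", "P02", "E07", "P03", "P04", "E09", "P05", "P06", "P07", "P08"]
entrapment = {"E07", "E09", "E12"}  # entries that should never be hit
print(false_discovery_match_ratio(hits, entrapment))
```

A lower ratio, as reported for the FP-DIA pipeline, indicates fewer spurious matches against sequences known to be absent from the sample.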
APA, Harvard, Vancouver, ISO, and other styles
10

Ozorun, Gulsev, Alexander Eckersley, Eleanor Bradley, Rachel Watson, Michael Sherrat, and Joe Swift. "P28 Data-independent acquisition mass spectrometry improves spatially resolved analysis of the human skin proteome". British Journal of Dermatology 190, no. 6 (May 17, 2024): e92. http://dx.doi.org/10.1093/bjd/ljae105.050.

Full text
Abstract
Introduction and aims: Proteomic analysis of the extracellular matrix (ECM) presents challenges because of the highly crosslinked and low-solubility nature of ECM proteins. Traditional homogenization and protein digestion approaches result in the loss of crucial information regarding protein localization and spatial relationships. To address this, spatially resolved proteomics emerges as a powerful tool for exploring heterogeneity within bulk tissues. This study aims to determine the minimum tissue volume required for comprehensive proteome coverage using data-independent acquisition mass spectrometry (DIA-MS) on skin tissue. The study focused on optimizing spatially resolved proteomic techniques to enhance depth-of-analysis while preserving spatial specificity.
Methods: Human abdominal skin biopsies were obtained from a single individual and subsequently cryosectioned. Histological assessment was performed through haematoxylin and eosin staining for visualization purposes. Laser-capture microdissection coupled with mass spectrometry facilitated the precise isolation of target regions. Comparative analyses were performed between data-dependent acquisition mass spectrometry (DDA-MS) and DIA-MS, with a particular emphasis on ECM proteins within the dermis.
Results: Our findings revealed an improvement in proteome coverage with DIA-MS compared with DDA-MS, in addition to clear scaling relationships between the depth-of-analysis and sample concentration. The results demonstrated the superiority of DIA-MS in achieving robust and comprehensive proteomic profiles, even with minimal tissue volumes. Preliminary findings suggest the capability of DIA-MS in elucidating the complexities of the skin proteome with spatial resolution.
Conclusions: In conclusion, our study highlights the efficacy of DIA-MS in spatially resolved proteomics on skin tissue.
The optimized approach presented here offers a reliable and efficient method for obtaining in-depth proteome information with minimal tissue requirements. These results form the foundation for ongoing experiments, utilizing DIA-MS to advance spatially resolved proteomic analyses of human skin.
APA, Harvard, Vancouver, ISO, and other styles
More sources

Theses on the topic "Data-Independent Acquisition (DIA)"

1

De Lama Valderrama, Noelia Milagros. "Development of new mass spectrometry methods for the characterization of protein impurities in therapeutic antibodies". Electronic Thesis or Diss., Strasbourg, 2025. http://www.theses.fr/2025STRAF008.

Full text
Abstract
Host cell proteins (HCPs) are unwanted by-products in the production of monoclonal antibodies (mAbs), and even at low levels, they can affect the safety, efficacy, and stability of biopharmaceuticals. While ELISA is widely used for HCP detection, it lacks full impurity coverage. This work explores complementary mass spectrometry-based methods to address these limitations. An immune-capture MS approach targets non-immunoreactive HCPs missed by ELISA, while advanced LC-MS/MS workflows using peptide standards enable more accurate and flexible quantification. These tools aim to improve impurity profiling and strengthen quality control in mAb manufacturing.
APA, Harvard, Vancouver, ISO, and other styles
2

Moreira, Tiago do Carmo Santos Soares. "Targeted, semi-targeted and non-targeted screening for drugs in whole blood by UHPLC-TOF-MS with data-independent acquisition (DIA)". Master's thesis, 2014. https://repositorio-aberto.up.pt/handle/10216/80124.

Full text
APA, Harvard, Vancouver, ISO, and other styles

Book chapters on the topic "Data-Independent Acquisition (DIA)"

1

Casavant, Ellen P., Jason Liang, Sumedh Sankhe, W. Rodney Mathews, and Veronica G. Anania. "Using SILAC to Develop Quantitative Data-Independent Acquisition (DIA) Proteomic Methods". In Methods in Molecular Biology, 245–57. New York, NY: Springer US, 2022. http://dx.doi.org/10.1007/978-1-0716-2863-8_20.

Full text
APA, Harvard, Vancouver, ISO, and other styles
2

Molzahn, Cristen, Lorenz Nierves, Philipp F. Lange, and Thibault Mayor. "Isolation of Detergent Insoluble Proteins from Mouse Brain Tissue for Quantitative Analysis Using Data Independent Acquisition (DIA)". In Methods in Molecular Biology, 29–51. New York, NY: Springer US, 2022. http://dx.doi.org/10.1007/978-1-0716-2124-0_3.

Full text
APA, Harvard, Vancouver, ISO, and other styles
3

Holtz, Anja, Nathan Basisty, and Birgit Schilling. "Quantification and Identification of Post-Translational Modifications Using Modern Proteomics". In Methods in Molecular Biology, 225–35. New York, NY: Springer US, 2021. http://dx.doi.org/10.1007/978-1-0716-1024-4_16.

Full text
Abstract
Post-translational modifications (PTMs) occur dynamically, allowing cells to quickly respond to changes in the environment. Lysine residues can be targeted by several modifications including acylations (acetylation, succinylation, malonylation, glutarylation, and others), methylation, ubiquitination, and other modifications. One of the most efficient methods for the identification of post-translational modifications is utilizing immunoaffinity enrichment followed by high-resolution mass spectrometry. This workflow can be coupled with comprehensive data-independent acquisition (DIA) mass spectrometry to be a high-throughput, label-free PTM quantification approach. Below we describe a detailed protocol to process tissue by homogenization and proteolytically digest proteins, followed by immunoaffinity enrichment of lysine-acetylated peptides to identify and quantify relative changes of acetylation comparing different conditions.
APA, Harvard, Vancouver, ISO, and other styles
4

Díaz-Peña, Ramón, Pol Andrés-Benito, Erica Peral, Raúl Domínguez, Mónica Povedano, Enrique Santamaría, and Joaquín Fernández-Irigoyen. "High-Throughput Human Cerebrospinal Fluid Proteome Analysis with Direct Data-Independent Acquisition (dDIA)". In Methods in Molecular Biology, 129–39. New York, NY: Springer US, 2025. https://doi.org/10.1007/978-1-0716-4462-1_11.

Full text
APA, Harvard, Vancouver, ISO, and other styles
5

Sharma, Shweta. "Recent Trends of Modern Mass Spectrometry: Application towards Drug Discovery and Development Process". In Applications of Modern Mass Spectrometry volume 2, 209–24. Bentham Science Publishers, 2024. http://dx.doi.org/10.2174/9789815050059124020008.

Full text
Abstract
Mass spectrometry has evolved significantly in recent years and has become a powerful analytical tool in the field of drug discovery and development. It allows for the identification and characterization of small molecules, peptides, and proteins in complex biological samples with high sensitivity and accuracy. This chapter provides an overview of the recent trends in modern mass spectrometry and its application towards the drug discovery and development process. It discusses the advancements in mass spectrometry technology, such as high-resolution mass spectrometry (HRMS), ambient ionization mass spectrometry (AIMS), data-independent acquisition (DIA) mass spectrometry, tandem mass spectrometry (LC-MS/MS), and how they have enabled the analysis of complex biological samples. The chapter also highlights the use of mass spectrometry in various stages of the drug discovery and development process, including target identification, hit identification, lead optimization, and drug metabolism and pharmacokinetic studies. Additionally, it discusses the challenges and future prospects of mass spectrometry in drug discovery and development. Overall, mass spectrometry has revolutionized the drug discovery and development process and will continue to play a crucial role in the future.
APA, Harvard, Vancouver, ISO, and other styles

Conference papers on the topic "Data-Independent Acquisition (DIA)"

1

Menon, Biju, Sandeep Fernandes, and Sasidharan Adiyodi Kenoth. "Data Acquisition and Monitoring System Through Existing Communication Network". In GOTECH. SPE, 2025. https://doi.org/10.2118/224589-ms.

Full text
Abstract
Data collection and analysis are vital for evaluating the performance of any equipment or system. Though the methodology of collection, transmission, storage and retrieval varies in different scenarios, the basic components of the set-up remain the same. The Data Acquisition and Monitoring System evolved from the need for remote monitoring of facilities that did not have an independent reliable infrastructure to implement a full-fledged SCADA system. Back when IT infrastructure was prioritized for business communications within the limited bandwidth allocated, collection and transmission of process data through the available network infrastructure was a challenge, even more so when there were concerns regarding the confidentiality of data being collected and transmitted. Individual facilities were provided with local instrumentation and controls for reliable operations. With growing production rates, the focus generally goes to improving the processing facilities rather than reading, reporting or recording. Instead of an independent dedicated network for a Supervisory Control and Data Acquisition system that would mandate a huge infrastructural cost and dedicated skilled personnel, a pilot project was initiated with a minimum budget to test data transfer through the existing communication network. Field transmitters were installed on wellheads, risers and manifolds, and Remote Terminal Units were provided to collect data from two platforms and to facilitate data transfer to the servers through the existing microwave communication network with minimum field instrumentation. Considering bandwidth limitations and priority for business communications, servers in redundant configuration were placed strategically to ensure continuous data gathering. The pilot project was successful in providing reliable data and trends with minimum disruptions. Augmentation of the DAMS network was envisioned to implement a Digital Oil Field for monitoring multiple assets in real time.
Implementation of Digital Oil Field continued with the addition of new field instrumentation and RTUs followed by server upgrades and integration of PLC based platform control systems to add four major producing platforms to the Data Acquisition and Monitoring System through existing microwave communication network. Upon successful commissioning, it was possible to gather data from various wellheads, process equipment and other independent meters for remote real time monitoring and analysis through user friendly HMI screens and customized dashboards to manage multiple assets effectively, improving performance evaluation, safety and reliability of operations.
APA, Harvard, Vancouver, ISO, and other styles
2

Kolonic, Sadat, Maneesh Pisharat, Josef Schachner, Richard Shipp, Hemmo Bosscher, Pim Van Bergen, and Olaf Podlaha. "Operationalization of Advanced Mud Gas Logging in Development Drilling: Examples From the Recent HPHT Infill Campaign in the Central North Sea". In 2023 SPWLA 64th Annual Symposium. Society of Petrophysicists and Well Log Analysts, 2023. http://dx.doi.org/10.30632/spwla-2023-0041.

Full text
Abstract
Standard mud gas logging has served the drill-engineering discipline foremost in executing safe well delivery. Additional subsurface insights are often considered less important when commissioning this service. Consequently, standard mud gas (SMG) logging remains routine despite the advances in quantifiable advanced mud gas (AMG) logging capability. Such advances make it more operationally feasible to deploy AMG and thereby markedly enhance the acquired subsurface insights. This was demonstrated during a recent high-pressure/high-temperature (HPHT) infill campaign in the Central North Sea (CNS). Wells targeting deep Jurassic formations have used AMG technology for continuous compositional analysis while drilling. For a mature field experiencing production-related changes to reservoir fluid, the main objective of collecting AMG data is to aid early assessment of downhole hydrocarbon variability. Operationally this is being performed while drilling in liner (DIL) and in the absence of logging while drilling (LWD). For example, identifying reservoir tops, fluid dissimilarities, and an independent saturation flag is critical operational information. These help to guide decisions on completion strategy and logging behind casing, which in turn aids rig time optimization and offsets the deployment costs. Post-drill systematic integration with other geochemistry data (e.g., gas isotopes, mineralogy, and kerogen compositions) and independent petrophysical techniques (such as triple combo) enables the identification of possible missed pay zones furthermore. Once the “field” is calibrated, the AMG data increase fluid phase interpretation confidence in support of near-time operational decisions and overall reservoir management. An example is the confirmation of new flow unit contributors to perforations for future well interventions/abandonment consideration. Further value upside and differentiation are achieved by collecting the AMG data across the overburdened chalk. 
The latter provided the first-time in-field granularity on chalk fluid facies, reservoir architecture, and connectivity. In addition, we highlight the added value of information, practical applicability, and consideration for future ultradeep HPHT developments. We advocate the increasing feasibility and appropriateness of progressing AMG to a more routine deployment state in similar field settings and beyond. In the medium term, the quantitative mud gas records acquired by continuous physical sampling may further improve our understanding of vertical fluid evolution in the present-day overburden. Understanding this deep subsurface sediment-(hydro)carbon, i.e., rock-fluid interactions, offers additional potential subsurface solutions. Effects such as active cycling of carbon-bearing phases during fluid migration under post-burial prereservoir conditions could be addressed. These remain challenging in carbon capture underground storage (CCUS) project implementation. The objectives were to:
1. Support collection of AMG data in a brownfield to aid early assessment of downhole hydrocarbon fingerprinting while DIL and in the absence of LWD log in the HPHT environment.
2. Support the identification of wellbore breathing, reservoir tops, fluid properties, and independent saturation flag to aid decisions on completion strategy and behind-the-casing logging.
3. Assess chalk fluid facies, reservoir architecture, and connectivity in the field.
AMG samples the mud at the surface, extracts light hydrocarbon, and quantifies the composition. Three data processing steps convert measured hydrocarbon into gas volume per unit of rock drilled. They are corrected for extractor response, contamination, and volume changes due to variations in drilling parameters. The corrected data are used for compositional analysis, identification of pay zones, and deriving saturation flags. AMG has proven to be a pragmatic, independent additional fluid assessment technology tool during this infill campaign.
It carried low operational risk compared to downhole logging/sampling in HPHT. It has proven an inexpensive methodology to maximize data acquisition outside the primary reservoir objective at a minimum cost. Hence the recommendation is to employ this technology as standard with additional benefits in the absence of not being able to acquire logging data. Systematic and routine AMG in a mature field development drilling may thus far prove to be a means of an inexpensive pseudo-production logging tool (PLT) analyzing dynamic filed performance and determining the zonal contribution (in case of co-mingled stacked sands or multiple pays or swept zones) in the total production. Detecting fluid (dis)similarities and linking these to subseismic faulting or juxtaposition would further allow corroborating 4D seismic interpretations and aid infill drilling strategy. Furthermore, amendments to well trajectory/well placement for improved sweeping efficiency, section TDs/casing shoe depth (gas cap expansion), completions, or front-end design are some examples of effective mitigation of downside risk contribution through improved fluid understanding from AMG if deployed routinely on infill wells.
3

Nordin, T. B., L. W. Ong, D. Saadon, and A. B. Yaakob. "Enhancing Reservoir Insight with Real Time LWD Azimuthal Density Imaging, Wireline Advanced Log Acquisition & Core/Cuttings Analysis: a Case Study of Complex Lithology in Deepwater Sabah, Malaysia". In International Petroleum Technology Conference. IPTC, 2025. https://doi.org/10.2523/iptc-25064-ea.

Full text
Abstract
Abstract This paper presents a case study of evaluating a deepwater exploration well with complex lithology and no suitable nearby reference/analog wells by integrating core/cuttings analysis (utilizing XRD/XRF), wireline formation testing, advanced well logging (3D nuclear magnetic resonance and resistivity image logs), and real-time LWD azimuthal density imaging. Deepwater exploration well X-1 was drilled in the studied field, located offshore deepwater Sabah, Malaysia, with a carbonate reservoir as the target objective. In this well, the data acquisition program was planned for the expected carbonate prospect with potentially complex lithology. Advanced wireline logging tools, namely nuclear magnetic resonance (3D NMR), water-based mud borehole imaging, and a formation tester with an advanced DFA (downhole fluid analyzer), as well as real-time advanced analysis of drill cuttings (XRD/XRF), were part of the data acquisition program. The well encountered complex lithology instead of conventional carbonate. The synergistic use of advanced logging data, aided by real-time LWD azimuthal density imaging and XRD/XRF cuttings analysis, allowed identification of lithology with formation-bearing fluids, reservoir quality, invasion, and nearly saturated formation fluids with complex mineralogy. Applying a conventional evaluation approach without thorough integration of advanced wireline log data and a processing workflow would not have achieved the formation evaluation objectives in this complex lithology. The integrated workflow delivered a robust interpretation that helps reduce uncertainties in complex reservoir characterization. Key input data for the petrophysical evaluation of the well revolve around 3D nuclear magnetic resonance, image logs, and real-time well cuttings with advanced analysis (XRD/XRF).
New wireline nuclear magnetic resonance (NMR) logging has been routinely used to measure T1, T2, and diffusion in addition to mineralogy-independent porosity, irreducible water saturation, and a permeability indication of the reservoir. The T1 and T2 distributions inverted from NMR logging data are often composed of several fluid-type components, each of which can be represented by unique T1 and T2 peaks in the distributions. This paper first discusses the application of NMR fluid characterization using T1-T2-diffusion (3D model) and T1-T2 and D-T2 (2D models) from the latest-generation multi-frequency NMR tool. The T1-T2 method increases the contrast between gas and oil, distinguishing gas from oil across a wide viscosity range. The D-T2 method uses diffusion analysis to identify oil and water. The water-based-mud resistivity image log was acquired with borehole coverage of approximately 60% in an 8.5-inch hole, with vertical resolution down to 5 mm. The raw image log data were processed with speed correction, eccentricity correction, and normalization. The post-processed high-resolution resistivity images were used for sedimentary interpretation together with LWD azimuthal density images. This paper also describes a method for assessing complex reservoir lithology that uses onsite XRD/XRF analysis of drill cuttings and sidewall core samples obtained during drilling. Compared with submitting samples to an outside lab, on-site XRD/XRF analysis yields faster findings and costs less. Complex lithologies/volcanics can host significant hydrocarbon resources but are poorly understood in terms of their reservoir properties, especially their porosity and permeability characteristics. Despite low permeability, volcanic rocks are good hydrocarbon prospects, and understanding the distribution of their key internal zones and characteristic porosity and permeability will enhance hydrocarbon exploration within these unconventional reservoirs.
In summary, the integration of available multi-disciplinary data (the latest generation of advanced wireline logging and LWD data), the application of a new in-house integrated workflow, and solid geoscience knowledge added value to our understanding of the surrounding area, especially in deepwater Sabah, Malaysia.
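The T1-T2 and D-T2 fluid-typing logic summarized in this entry can be caricatured as a threshold classifier: gas shows a high T1/T2 ratio and fast diffusion, water sits near the free-water diffusion coefficient, and oil diffuses more slowly. The function and cutoffs below are illustrative assumptions only; a real workflow inverts full 2D/3D maps and calibrates cutoffs per field:

```python
def classify_nmr_component(t1_s, t2_s, diffusion_m2s):
    """Label an NMR fluid component from its (T1, T2, D) position.

    Illustrative cutoffs only; not the tool's actual inversion.
    """
    ratio = t1_s / t2_s
    if ratio > 4.0 and diffusion_m2s > 1e-8:
        return "gas"        # high T1/T2 contrast plus fast diffusion
    if abs(diffusion_m2s - 2.5e-9) < 1e-9:
        return "water"      # diffusion near the free-water value
    if diffusion_m2s < 1.5e-9:
        return "oil"        # restricted diffusion, wide viscosity range
    return "unresolved"
```

This is why the paper pairs the two 2D models: the T1/T2 ratio separates gas from oil, while diffusion separates oil from water, and only the combination resolves all three phases.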
4

Ihara, Masayuki, Hiroko Tokunaga, Hiroki Murakami, Shinpei Saruwatari, Akihiko Koga, Takashi Yukihira, Shinya Hisano, Kazuki Takeshita, Ryoichi Maeda, and Masashige Motoe. "Human values assessment toward AI-based patient state prediction". In Human Interaction and Emerging Technologies (IHIET-AI 2024). AHFE International, 2024. http://dx.doi.org/10.54941/ahfe1004600.

Full text
Abstract
Recent advances in artificial intelligence (AI) technology are remarkable, and AI may in the future predict a patient's future state (prognosis) using large amounts of data. However, patients are not always satisfied with the prediction results. For a patient to accept prediction results and change behaviors in life, the results should reflect his or her values. AI-based patient state prediction should be implemented in a way that is not only data-driven but also integrated with domain knowledge about human values. This paper reports a case study in which we assessed value-related data toward building domain knowledge for AI-based patient state prediction. We designed a rehabilitation exercise service based on a person-centered principle [1] and intervened with one patient [2]. From the preliminary stage of service design, we attempted to build rapport and motivate her, expecting her to self-disclose through repeated dialogues. While coming to understand her life background and values, we defined rehabilitation goals for her independent living. Twelve rehabilitation exercise sessions were held over three months, and questionnaires and semi-structured interviews were conducted before the first session and after the final session. Even after the experiment ended, we continued to have conversations with her and obtained information about her subsequent life. Measurements taken after the experiment showed no noticeable effect on her physical functions. Regarding her self-disclosure, as the experiment progressed she revealed more of her true feelings, including negative ones. Before the experiment, she commented negatively about daily activities she could not do well, but after the experiment she commented positively, focusing on the things she could do, even if imperfectly.
We confirmed her active participation in social activities, such as going out wearing clothing she had sewn herself, an activity she had viewed negatively before the experiment. Factors that influenced her behavior changes include building rapport through repeated conversations and self-disclosure prompted by the experimenter's interest in and empathy for her. At the beginning of the experiment, most disclosures were positive-sounding and superficial, but they gradually changed to deeper disclosures that included negative content. The patient value data suggested by the self-disclosure in this study are still insufficient to form a value domain knowledge system. However, we believe that the acquisition of value data based on a patient's self-disclosure and behavioral changes will help realize AI state prediction that patients will accept in the future. Future work will include a detailed analysis of the factors that influenced self-disclosure and the construction of a value assessment framework. [1] Kitwood, T. and Bredin, K. (1992). Towards a theory of dementia care: Personhood and well-being. Ageing and Society, Vol. 12, No. 3, pp. 269-287. [2] Yukihira, T. et al. (2023). Toward an online rehabilitation exercise service based on personal independent living goals and risk management. Human Systems Engineering and Design (IHSED 2023): Future Trends and Applications, Vol. 112, pp. 187-194.
5

Naveena-Chandran, Rohin, Farrukh Hamza, Gibran Hashmi, Jason Rogers, John Meyer, and Sara Chapman. "ENHANCING THE UNDERSTANDING OF ASPHALTENE PRECIPITATION: A NOVEL APPROACH UNDER IN-SITU CONDITIONS". In 2021 SPWLA 62nd Annual Logging Symposium Online. Society of Petrophysicists and Well Log Analysts, 2021. http://dx.doi.org/10.30632/spwla-2021-0012.

Full text
Abstract
Flow assurance is a vital challenge that affects the viability of an asset in all oil-producing environments. A proper understanding of asphaltene precipitation leading to deposition enables reliable completions planning and timely remediation efforts, which ultimately dictates the production life of the reservoir. The wireline formation tester (WFT) has traditionally aided the understanding of asphaltene composition in reservoir fluids through the collection of pressurized fluid samples. Moreover, the use of downhole fluid analysis (DFA) during a fluid pumpout has augmented the understanding of soluble asphaltenes under in-situ flowing conditions. However, an accurate and representative measurement of asphaltene onset pressure (AOP) has eluded the industry. Traditionally, this measurement has been determined post-acquisition through different laboratory techniques performed on a restored fluid sample. Although sound, these techniques face inherent challenges that affect the quality of the results, primarily the need to restore samples to reservoir conditions, to maintain samples at equilibrium composition, and to avoid destruction of fluid samples through inadvertent asphaltene precipitation during transport and handling. Hence, there is a need for WFT operations to deliver reliable analysis, particularly in high-pressure/high-temperature (HP/HT) reservoirs, to avoid costly miscalculations. A premier industry method to determine AOP under in-situ producible conditions is presented. Demonstrated in a Gulf of Mexico (GOM) reservoir, this novel technique mimics the gravimetric and light-scattering methods: a fluid sample is isothermally depressurized from initial reservoir pressure while DFA monitors asphaltene precipitation from solution and a high-precision pressure gauge records the onset of precipitation. This measurement is provided continuously and in real time.
An added advantage is that experiments are performed individually after obtaining a pressurized sample in distinct oil zones. Therefore, the execution of this downhole AOP experiment is independent of any already captured fluid sample and does not impact the quality of later laboratory-based analysis. Once obtained, these measurements can be used in flow assurance modeling methods to describe asphaltene precipitation kinetics and the continuity of complex reservoirs. For the first time in the literature, this study applies these modeling methods in combination with AOP data acquired from a downhole WFT. This approach has the potential to create a step change in reservoir analysis by providing AOP at the sand face, along with insights that describe performance impacts from asphaltene precipitation, results that have tremendous economic implications for production planning.
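The onset-detection idea in this entry, DFA monitoring optical density while a high-precision gauge records pressure during isothermal depressurization, can be sketched as a baseline-deviation test: fit the single-phase trend at high pressure, then flag the first pressure where scattering pushes the signal off that trend. The function, thresholds, and parameter names below are illustrative assumptions, not the service company's algorithm:

```python
import numpy as np

def detect_aop(pressures_psi, optical_density, baseline_n=10, k=3.0):
    """Estimate asphaltene onset pressure from a depressurization sweep.

    pressures_psi must be sorted high -> low (isothermal depressurization
    from reservoir pressure). A line is fit to the first `baseline_n`
    points (single-phase trend); onset is flagged at the first pressure
    where optical density exceeds that trend by more than `k` baseline
    standard deviations. Parameters are illustrative.
    """
    p = np.asarray(pressures_psi, dtype=float)
    od = np.asarray(optical_density, dtype=float)
    # Fit the single-phase baseline above the (unknown) onset pressure.
    slope, intercept = np.polyfit(p[:baseline_n], od[:baseline_n], 1)
    resid = od - (slope * p + intercept)
    sigma = max(float(np.std(resid[:baseline_n])), 1e-6)  # noise floor
    for i in range(baseline_n, len(p)):
        if resid[i] > k * sigma:
            return float(p[i])  # first pressure with excess scattering
    return None                 # no onset detected within the sweep
```

Because the detection runs on the live pressure/optical-density stream, the onset pressure is available in real time rather than weeks later from a restored sample, which is the core advantage the entry describes.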
6

Jain, Vikas, Muhamad Saiful Hakimi Daud, Milad Saidian, and Eric Soza. "A Novel Workflow to Model Grain Size and Capillary Pressure by Integrating Nuclear Magnetic Resonance Well Logs and Core Data: A Case Study for an Offshore Gulf of Mexico Oil Reservoir". In SPE Annual Technical Conference and Exhibition. SPE, 2023. http://dx.doi.org/10.2118/214797-ms.

Full text
Abstract
Abstract Subsurface properties derived from core testing are generally considered ground truth during the life cycle of field exploitation. Core acquisition is expensive and analysis is time-consuming, so not many wells in a field are cored. Therefore, it is common practice to integrate core and log data to develop a petrophysical model for uncored wells. However, due to the complexities involved in integrating advanced multi-dimensional measurements, the value of log data such as nuclear magnetic resonance (NMR) is not fully realized. In this paper, we present a novel automated workflow to calibrate NMR well logs with mercury injection capillary pressure (MICP) and grain size distribution (GSD) data and apply the calibration blindly to NMR logs in wells without core data to validate the models. First, NMR factor analysis and fluid substitution are performed on the NMR well logs to 1) remove any hydrocarbon effect to simulate 100% water-bearing T2 distributions, 2) decompose T2 distributions into poro-factors representing underlying pore size distributions, and 3) determine NMR-specific textural facies over the depth intervals of wells with and without MICP and GSD data. Mean poro-factors from the 100% water-bearing T2 distributions are then automatically calibrated against mean pore throat size distributions from MICP and mean grain size distributions in every facies for wells where such core data are available. Calibration models can then be easily applied across all wells with common textural facies derived using only NMR data. These workflows are applied to an oil reservoir in the Gulf of Mexico. The field is located in the southern Green Canyon protraction area of the Gulf of Mexico, in more than 4,500 feet of water. The main reservoir sandstones are early Miocene in age and were deposited by turbidity currents in a basin floor fan, slope to abyssal plain setting.
These deposits overlaid autochthonous salt, which eventually evacuated, leading to a structurally complex salt-cored anticline with four-way dip closure. The workflow, including calibration and application, provides consistent and continuous results across wells for both MICP and GSD using NMR T2 distributions. The outputs include MICP-equivalent capillary pressures, pore-throat-size distributions, water saturation, Swanson permeability, entry pressure, and irreducible pressure. Saturation height functions are derived by facies for incorporation into reservoir models. The outputs also include equivalent grain size distributions; Wentworth grain size volumes such as clay, silt, and sand; the distribution numbers (D10, D50, and D90); sorting coefficient; uniformity coefficient; and fines volume. An independent GSD model was also developed using routine core analysis (RCA) data to establish a relationship between porosity, permeability, and mean grain size. Both the NMR GSD model and the RCA GSD model demonstrated results consistent with the available core data in validation datasets. The proposed workflow automates the NMR MICP and GSD core-log integration process to quickly propagate expensive core knowledge across all wells in a field. Calibration ensures that no single or variable scale factor is assumed across the range of T2; rather, the facies-wise models are completely data-driven. The workflow also eliminates the need to accurately depth-match core and log data. Calibration models can be stored, shared among experts, and applied post-logging or in real time for quick decision-making related to reservoir engineering, completions, and production.
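The core-log calibration idea in this entry, converting NMR T2 into an MICP-equivalent capillary pressure, can be illustrated with a per-facies scale factor (fit against MICP data where core exists) mapping T2 to pore-throat radius, followed by the Washburn equation for mercury. The scale factor, function name, and fluid constants below are illustrative assumptions, not the paper's fitted models:

```python
import math

def t2_to_capillary_pressure(t2_s, facies_scale_um_per_s,
                             ift_n_per_m=0.485, contact_deg=140.0):
    """Map an NMR T2 value to an MICP-equivalent capillary pressure (psi).

    A per-facies scale factor converts T2 to pore-throat radius; the
    Washburn equation then gives the mercury capillary pressure. The
    defaults are the commonly quoted mercury-air interfacial tension
    (0.485 N/m) and contact angle (140 deg); the scale factor is the
    quantity a real workflow would calibrate against MICP core data.
    """
    radius_m = facies_scale_um_per_s * t2_s * 1e-6  # T2 -> throat radius
    pc_pa = abs(2.0 * ift_n_per_m *
                math.cos(math.radians(contact_deg))) / radius_m
    return pc_pa / 6894.76                          # Pa -> psi
```

The per-facies factor is what makes the calibration data-driven: instead of one global T2-to-radius conversion, each textural facies gets its own mapping, so larger T2 (bigger pores) always yields lower capillary pressure within a facies but the proportionality differs between facies.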
7

Abrar, Saad Bin, Syed Shariq Ali Hashmi, Muhammad Sarmad, Majid Najeeb Siddiqui, Tariq Aziz, and Naeem Sardar. "Unlocking Recovery Potential in a Mature Clastic Oil-Rim Reservoir Using an Integrated Reservoir Simulation Model: A Case Study of the Eastern Potwar Basin, Pakistan". In SPE/PAPG Pakistan Section Annual Technical Symposium and Exhibition. SPE, 2023. http://dx.doi.org/10.2118/219497-ms.

Full text
Abstract
Abstract This paper presents a case study of Field Sigma, a large gas condensate field in Pakistan. Sigma comprises two primary sandstone reservoir formations, X and Y, where an oil rim was discovered late in field life. A re-development plan utilizing an integrated reservoir simulation model was formulated to enhance oil production from the oil rim beneath a gas cap. The key challenges included reservoir pressure depletion, reservoir heterogeneity, complex fault geometry, and fluid contact uncertainty. The study introduces a novel concept for optimizing oil production from the oil-rim reservoir through an integrated reservoir study. An independent seismic interpretation was conducted on the most recent 3D seismic data, focusing on mapping the major faults that impact the static/dynamic reservoir model and evaluating the transmissibility of the mapped faults iteratively during the history-matching phase. A comprehensive petrophysical study was conducted to calculate petrophysical parameters on a field scale, providing an updated and consistent analysis that minimized uncertainties from previous interpretations. Field Sigma has a complex anticline structure with intricate fault geometries and thrust sections; thus, the static model's structural grid was created using the volume-based modelling (VBM) method, chosen over corner point gridding to represent the complex nature of the field's structure accurately. The reservoir engineering data were thoroughly analyzed and incorporated into a dynamic simulation model, and the history-matched compositional model was used to generate production forecasts. From the various evaluated well locations, eight economically favorable infill and appraisal sites were identified, five in the Southeastern compartment and three in the sub-thrust area of the Northeastern compartment.
To appraise the free water level (FWL), Sigma-5 was proposed as an appraisal well in the Southeastern compartment; after achieving the appraisal objective, in case of water production, the well had the option to sidetrack to an up-dip location. The infill wells Sigma-6, 7, and 9 were proposed between Sigma-3 and 4 to drain the remaining hydrocarbon volumes and were considered comparatively less risky than the other wells. However, due to the limited dynamic data, there is uncertainty in reservoir connectivity and a possibility of encountering pressures more depleted than the simulated pressures. The appraisal well Sigma-8 is proposed to evaluate the eastern extent of the Southeastern compartment; however, this area has high depth uncertainty and limited well control. All three wells identified in the Northeastern compartment target the sub-thrust region and were considered high-risk wells. The paper emphasizes the crucial role of integrating data from diverse sources in the re-development of a complex, mature oil-rim brownfield. Through the combination of geological knowledge, reservoir-level petrophysical evaluation, incorporation of core data, production history, reservoir understanding, and critical data acquisition during infill drilling, the project team gained the confidence to devise and execute a successful re-development strategy. The iterative creation of robust static and dynamic models provides a valuable planning resource for future endeavors. The methodology outlined in the paper holds broad applicability to typical field developments, establishing it as a valuable industry practice.