Journal articles on the topic "Computational modeling workflow"

To see other types of publications on this topic, follow the link: Computational modeling workflow.

Cite a source in APA, MLA, Chicago, Harvard, and other citation styles

Consult the top 50 journal articles for your research on the topic "Computational modeling workflow".

Next to every source in the list of references there is an "Add to bibliography" button. Press it, and we will automatically generate the bibliographic reference to the chosen source in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the scholarly publication as a .pdf file and read its abstract online, if these are available in the metadata.

Browse journal articles across a wide variety of disciplines and compile your bibliography correctly.

1

Deelman, Ewa, Christopher Carothers, Anirban Mandal, Brian Tierney, Jeffrey S. Vetter, Ilya Baldin, Claris Castillo, et al. "PANORAMA: An approach to performance modeling and diagnosis of extreme-scale workflows." International Journal of High Performance Computing Applications 31, no. 1 (July 27, 2016): 4–18. http://dx.doi.org/10.1177/1094342015594515.

Abstract:
Computational science is well established as the third pillar of scientific discovery and is on par with experimentation and theory. However, as we move closer toward the ability to execute exascale calculations and process the ensuing extreme-scale amounts of data produced by both experiments and computations alike, the complexity of managing the compute and data analysis tasks has grown beyond the capabilities of domain scientists. Thus, workflow management systems are absolutely necessary to ensure current and future scientific discoveries. A key research question for these workflow management systems concerns the performance optimization of complex calculation and data analysis tasks. The central contribution of this article is a description of the PANORAMA approach for modeling and diagnosing the run-time performance of complex scientific workflows. This approach integrates extreme-scale systems testbed experimentation, structured analytical modeling, and parallel systems simulation into a comprehensive workflow framework called Pegasus for understanding and improving the overall performance of complex scientific workflows.
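The abstract above describes analytical modeling of workflow run-time performance. As a rough illustration of what such a model computes, the sketch below estimates a workflow's makespan as the longest path through a small task graph, combining per-task runtimes with inter-task data transfer times. This is a generic toy model, not PANORAMA's actual formulation; all task names and numbers are invented.

```python
# A minimal sketch of analytical workflow performance modeling: the makespan is the
# longest path through the task graph, with runtimes and transfer times (illustrative).
import networkx as nx

g = nx.DiGraph()
g.add_node("stage_in", runtime=120)      # seconds
g.add_node("reconstruct", runtime=1800)
g.add_node("analyze", runtime=600)
g.add_node("stage_out", runtime=90)
g.add_edge("stage_in", "reconstruct", transfer=300)
g.add_edge("reconstruct", "analyze", transfer=60)
g.add_edge("analyze", "stage_out", transfer=30)

def makespan(graph):
    """Finish time of the latest task, visiting tasks in topological order."""
    finish = {}
    for node in nx.topological_sort(graph):
        ready = max((finish[p] + graph.edges[p, node]["transfer"]
                     for p in graph.predecessors(node)), default=0)
        finish[node] = ready + graph.nodes[node]["runtime"]
    return max(finish.values())

print(makespan(g), "seconds")
```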
2

Ackerman, Aidan, Jonathan Cave, Chien-Yu Lin, and Kyle Stillwell. "Computational modeling for climate change: Simulating and visualizing a resilient landscape architecture design approach." International Journal of Architectural Computing 17, no. 2 (May 16, 2019): 125–47. http://dx.doi.org/10.1177/1478077119849659.

Abstract:
Coastlines are changing, wildfires are raging, cities are getting hotter, and spatial designers are charged with the task of designing to mitigate these unknowns. This research examines computational digital workflows to understand and alleviate the impacts of climate change on urban landscapes. The methodology includes two separate simulation and visualization workflows. The first workflow uses an animated particle fluid simulator in combination with geographic information systems data, Photoshop software, and three-dimensional modeling and animation software to simulate erosion and sedimentation patterns, coastal inundation, and sea level rise. The second workflow integrates building information modeling data, computational fluid dynamics simulators, and parameters from EnergyPlus and Landsat to produce typologies and strategies for mitigating urban heat island effects. The effectiveness of these workflows is demonstrated by inserting design prototypes into modeled environments to visualize their success or failure. The result of these efforts is a suite of workflows which have the potential to vastly improve the efficacy with which architects and landscape architects use existing data to address the urgency of climate change.
3

Bicer, Tekin, Doğa Gürsoy, Rajkumar Kettimuthu, Francesco De Carlo, and Ian T. Foster. "Optimization of tomographic reconstruction workflows on geographically distributed resources." Journal of Synchrotron Radiation 23, no. 4 (June 15, 2016): 997–1005. http://dx.doi.org/10.1107/s1600577516007980.

Abstract:
New technological advancements in synchrotron light sources enable data acquisitions at unprecedented levels. This emergent trend affects not only the size of the generated data but also the need for larger computational resources. Although beamline scientists and users have access to local computational resources, these are typically limited and can result in extended execution times. Applications that are based on iterative processing, as in tomographic reconstruction methods, require high-performance compute clusters for timely analysis of data. Here, the focus is on time-sensitive analysis and processing of Advanced Photon Source data on geographically distributed resources. Two main challenges are considered: (i) modeling of the performance of tomographic reconstruction workflows and (ii) transparent execution of these workflows on distributed resources. For the former, three main stages are considered: (i) data transfer between storage and computational resources, (ii) wait/queue time of reconstruction jobs at compute resources, and (iii) computation of reconstruction tasks. These performance models allow evaluation and estimation of the execution time of any given iterative tomographic reconstruction workflow that runs on geographically distributed resources. For the latter challenge, a workflow management system is built, which can automate the execution of workflows and minimize the user interaction with the underlying infrastructure. The system utilizes Globus to perform secure and efficient data transfer operations. The proposed models and the workflow management system are evaluated by using three high-performance computing and two storage resources, all of which are geographically distributed. Workflows were created with different computational requirements using two compute-intensive tomographic reconstruction algorithms. Experimental evaluation shows that the proposed models and system can be used for selecting the optimum resources, which in turn can provide up to 3.13× speedup (on the experimented resources). Moreover, the error rates of the models range between 2.1% and 23.3% (considering workflow execution times), where the accuracy of the model estimations increases with higher computational demands in reconstruction tasks.
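As a concrete illustration of the three-stage performance model described above (transfer, queue, compute), the following sketch estimates end-to-end reconstruction time on candidate sites and picks the fastest one. It is a minimal toy model with made-up site parameters, not the paper's calibrated model.

```python
# A minimal sketch: total time = data transfer + queue wait + reconstruction compute,
# evaluated per site to choose where to run. All numbers and field names are illustrative.
def estimate_runtime(dataset_gb, site):
    t_transfer = dataset_gb * 8 / site["bandwidth_gbps"]           # seconds to move the data
    t_queue = site["expected_queue_s"]                              # expected batch-queue wait
    t_compute = site["seconds_per_gb_per_iter"] * dataset_gb * site["iterations"]
    return t_transfer + t_queue + t_compute

sites = {
    "cluster_a": {"bandwidth_gbps": 10, "expected_queue_s": 600, "seconds_per_gb_per_iter": 2.0, "iterations": 50},
    "cluster_b": {"bandwidth_gbps": 1,  "expected_queue_s": 60,  "seconds_per_gb_per_iter": 1.2, "iterations": 50},
}
best = min(sites, key=lambda s: estimate_runtime(200, sites[s]))
print(best, estimate_runtime(200, sites[best]))
```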
4

Cuda, G., P. Veltri, and M. Cannataro. "Modeling and Designing a Proteomics Application on PROTEUS." Methods of Information in Medicine 44, no. 02 (2005): 221–26. http://dx.doi.org/10.1055/s-0038-1633951.

Abstract:
Summary. Objectives: Biomedical applications, such as the analysis and management of mass spectrometry proteomics experiments, involve heterogeneous platforms and knowledge, massive data sets, and complex algorithms. The main requirements of such applications are semantic modeling of the experiments and data analysis, as well as high-performance computational platforms. In this paper we propose a software platform that allows biomedical applications to be modeled and executed on the Grid. Methods: Computational Grids offer the required computational power, whereas ontologies and workflows help to face the heterogeneity of biomedical applications. In this paper we propose the use of domain ontologies and workflow techniques for modeling biomedical applications, whereas Grid middleware is responsible for high-performance execution. As a case study, the modeling of a proteomics experiment is discussed. Results: The main result is the design and first use of PROTEUS, a Grid-based problem-solving environment for biomedical and bioinformatics applications. Conclusion: To manage the complexity of biomedical experiments, ontologies help to model applications and to identify appropriate data and algorithms, while workflow techniques allow the elements of such applications to be combined in a systematic way. Finally, the translation of workflows into execution plans allows the exploitation of the computational power of Grids. Along this direction, in this paper we present PROTEUS, discussing a real case study in the proteomics domain.
5

Vu, Phuong Thanh, Chuen-Fa Ni, Wei-Ci Li, I.-Hsien Lee, and Chi-Ping Lin. "Particle-Based Workflow for Modeling Uncertainty of Reactive Transport in 3D Discrete Fracture Networks." Water 11, no. 12 (November 27, 2019): 2502. http://dx.doi.org/10.3390/w11122502.

Abstract:
Fractures are major flow paths for solute transport in fractured rocks. Conducting numerical simulations of reactive transport in fractured rocks is a challenging task because of complex fracture connections and the associated nonuniform flows and chemical reactions. The study presents a computational workflow that can approximately simulate flow and reactive transport in complex fractured media. The workflow involves a series of computational processes. Specifically, the workflow employs a simple particle tracking (PT) algorithm to track flow paths in complex 3D discrete fracture networks (DFNs). The PHREEQC chemical reaction model is then used to simulate the reactive transport along particle traces. The study illustrates the developed workflow with three numerical examples, including a case with a simple fracture connection and two cases with a complex fracture network system. Results show that the integration processes in the workflow successfully model the tetrachloroethylene (PCE) and trichloroethylene (TCE) degradation and transport along particle traces in complex DFNs. The statistics of concentration along particle traces enable the estimation of uncertainty induced by the fracture structures in DFNs. The types of source contaminants can lead to slight variations of particle traces and influence the long-term reactive transport. The concentration uncertainty can propagate from parent to daughter compounds and accumulate along with the transport processes.
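To illustrate the kind of along-trace chemistry the workflow couples to particle tracking, the sketch below evaluates a first-order PCE-to-TCE decay chain as a function of particle travel time using the analytical Bateman solution. This is a simplified stand-in, not the PHREEQC coupling used in the paper; rate constants and travel times are placeholders.

```python
# A minimal sketch: first-order sequential decay of PCE into TCE along each particle trace,
# parameterized by travel time. All rate constants and travel times are assumed values.
import numpy as np

k_pce, k_tce = 1e-3, 5e-4          # first-order decay rates [1/day], assumed
c0_pce = 1.0                       # source concentration of PCE [mg/L]

def decay_chain(travel_time_days):
    """Analytical parent/daughter concentrations after a given travel time (Bateman solution)."""
    c_pce = c0_pce * np.exp(-k_pce * travel_time_days)
    c_tce = c0_pce * k_pce / (k_tce - k_pce) * (
        np.exp(-k_pce * travel_time_days) - np.exp(-k_tce * travel_time_days)
    )
    return c_pce, c_tce

travel_times = np.array([50.0, 120.0, 300.0])   # travel times of three hypothetical traces [days]
for t in travel_times:
    print(t, decay_chain(t))
```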
6

Subramanian, Govindan, and Shashidhar N. Rao. "An integrated computational workflow for efficient and quantitative modeling of renin inhibitors." Bioorganic & Medicinal Chemistry 20, no. 2 (January 2012): 851–58. http://dx.doi.org/10.1016/j.bmc.2011.11.063.

7

Mounika, R. Divya, et al. "A Benchmarking application on Workload and Performance forecasting of micro services." Turkish Journal of Computer and Mathematics Education (TURCOMAT) 12, no. 2 (April 10, 2021): 3232–38. http://dx.doi.org/10.17762/turcomat.v12i2.2381.

Abstract:
Microservices are increasingly understood as the ideal architectural framework for building large cloud applications within and beyond organizational boundaries. Microservice architectures scale up the application but are expensive to operate, so careful attention must be paid to workload and workflow planning. However, this issue has not been studied in depth. In this work, we develop modeling and prediction methods suitable for independent microservice workflows and design three-step game models for microservice-based applications. We address the problem of designing microservice-based applications to reduce end-to-end delays under user-specific constraints (MAWS-BC) and recommend microservice routing algorithms. The design process and estimation methods are improved and validated. The experimental results, produced on a well-known microservice benchmark, cover a wide variety of statistical analyses, and the practical utility of the design is shown by an extensive comparison against current algorithms.
8

Schoder, Stefan, Clemens Junger, and Manfred Kaltenbacher. "Computational aeroacoustics of the EAA benchmark case of an axial fan." Acta Acustica 4, no. 5 (2020): 22. http://dx.doi.org/10.1051/aacus/2020021.

Abstract:
This contribution benchmarks the aeroacoustic workflow of the perturbed convective wave equation and the Ffowcs Williams and Hawkings analogy in Farassat’s 1A version for a low-pressure axial fan. Thereby, we focus on the turbulence modeling of the flow simulation and mesh convergence concerning the complete aeroacoustic workflow. During the validation, good agreement has been found with the efficiency, the wall pressure sensor signals, and the mean velocity profiles in the duct. The analysis of the source term structures shows a strong correlation to the sound pressure spectrum. Finally, both acoustic sound propagation models are compared to the measured sound field data.
9

Pinomaa, Tatu, Ivan Yashchuk, Matti Lindroos, Tom Andersson, Nikolas Provatas, and Anssi Laukkanen. "Process-Structure-Properties-Performance Modeling for Selective Laser Melting." Metals 9, no. 11 (October 24, 2019): 1138. http://dx.doi.org/10.3390/met9111138.

Abstract:
Selective laser melting (SLM) is a promising manufacturing technique where the part design, from performance and properties to process control and alloying, can be accelerated with integrated computational materials engineering (ICME). This paper demonstrates a process-structure-properties-performance modeling framework for SLM. For powder-bed-scale melt pool modeling, we present a diffuse-interface multiphase computational fluid dynamics model which couples the Navier–Stokes, Cahn–Hilliard, and heat-transfer equations. A computationally efficient large-scale heat-transfer model is used to describe the temperature evolution in larger volumes. Phase field modeling is used to demonstrate how epitaxial growth of Ti-6-4 can be interrupted with inoculants to obtain an equiaxed polycrystalline structure. These structures are enriched with a synthetic lath martensite substructure, and their micromechanical response is investigated with a crystal plasticity model. The fatigue performance of these structures is analyzed, with spherical porelike defects and high-aspect-ratio cracklike defects incorporated, and a cycle-amplitude fatigue graph is produced to quantify the fatigue behavior of the structures. The simulated fatigue life presents trends consistent with the literature in terms of high-cycle and low-cycle fatigue, and the role of defects in dominating the respective performance of the produced SLM structures. The proposed ICME workflow emphasizes the possibilities arising from the vast design space exploitable with respect to manufacturing systems, powders, respective alloy chemistries, and microstructures. By digitalizing the whole workflow and enabling a thorough and detailed virtual evaluation of the causal relationships, the promise of product-targeted materials and solutions for metal additive manufacturing becomes closer to practical engineering application.
10

Schreier, Franz, Sebastián Gimeno García, Philipp Hochstaffl, and Steffen Städt. "Py4CAtS—PYthon for Computational ATmospheric Spectroscopy." Atmosphere 10, no. 5 (May 10, 2019): 262. http://dx.doi.org/10.3390/atmos10050262.

Abstract:
Radiation is a key process in the atmosphere. Numerous radiative transfer codes have been developed spanning a large range of wavelengths, complexities, speeds, and accuracies. In the infrared and microwave, line-by-line codes are crucial esp. for modeling and analyzing high-resolution spectroscopic observations. Here we present Py4CAtS—PYthon scripts for Computational ATmospheric Spectroscopy, a Python re-implementation of the Fortran Generic Atmospheric Radiation Line-by-line Code GARLIC, where computationally-intensive code sections use the Numeric/Scientific Python modules for highly optimized array processing. The individual steps of an infrared or microwave radiative transfer computation are implemented in separate scripts (and corresponding functions) to extract lines of relevant molecules in the spectral range of interest, to compute line-by-line cross sections for given pressure(s) and temperature(s), to combine cross sections to absorption coefficients and optical depths, and to integrate along the line-of-sight to transmission and radiance/intensity. Py4CAtS can be used in three ways: in the (Unix/Windows/Mac) console/terminal, inside the (I)Python interpreter, or Jupyter notebook. The basic design of the package, numerical and computational aspects relevant for optimization, and a sketch of the typical workflow are presented. In conclusion, Py4CAtS provides a versatile environment for “interactive” (and batch) line-by-line radiative transfer modeling.
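The abstract lists the pipeline steps (line cross sections, absorption coefficients, optical depths, transmission). The generic sketch below walks through those steps for a single synthetic Lorentzian line and an assumed trace-gas profile; it deliberately avoids the actual Py4CAtS function names, and all spectroscopic parameters are illustrative placeholders.

```python
# A minimal, generic sketch of the cross section -> absorption coefficient -> optical depth
# -> transmission chain; not the Py4CAtS API, and all numbers are synthetic.
import numpy as np

nu = np.linspace(2000.0, 2001.0, 500)               # wavenumber grid [1/cm]
z = np.linspace(0.0, 20e5, 40)                      # altitude grid [cm], 0-20 km
n_density = 1e13 * np.exp(-z / 8e5)                 # trace-gas number density [molec/cm^3]

gamma, nu0, s_line = 0.05, 2000.5, 1e-19            # half-width [1/cm], line center, line intensity
# Lorentzian cross section [cm^2/molec], identical at every altitude for simplicity
xs = s_line * (gamma / np.pi) / ((nu[None, :] - nu0) ** 2 + gamma ** 2) * np.ones((z.size, 1))

ac = xs * n_density[:, None]                        # absorption coefficient [1/cm]
od = (0.5 * (ac[1:] + ac[:-1]) * np.diff(z)[:, None]).sum(axis=0)   # trapezoidal vertical optical depth
transmission = np.exp(-od)                          # Beer-Lambert transmission
print(transmission.min(), transmission.max())
```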
11

Dao, Tien Tuan. "Advanced computational workflow for the multi-scale modeling of the bone metabolic processes." Medical & Biological Engineering & Computing 55, no. 6 (September 16, 2016): 923–33. http://dx.doi.org/10.1007/s11517-016-1572-z.

12

Wrede, Fredrik, and Andreas Hellander. "Smart computational exploration of stochastic gene regulatory network models using human-in-the-loop semi-supervised learning." Bioinformatics 35, no. 24 (May 29, 2019): 5199–206. http://dx.doi.org/10.1093/bioinformatics/btz420.

Abstract:
Abstract Motivation Discrete stochastic models of gene regulatory networks are indispensable tools for biological inquiry since they allow the modeler to predict how molecular interactions give rise to nonlinear system output. Model exploration with the objective of generating qualitative hypotheses about the workings of a pathway is usually the first step in the modeling process. It involves simulating the gene network model under a very large range of conditions, due to the large uncertainty in interactions and kinetic parameters. This makes model exploration highly computationally demanding. Furthermore, with no prior information about the model behavior, labor-intensive manual inspection of very large amounts of simulation results becomes necessary. This limits systematic computational exploration to simplistic models. Results We have developed an interactive, smart workflow for model exploration based on semi-supervised learning and human-in-the-loop labeling of data. The workflow lets a modeler rapidly discover ranges of interesting behaviors predicted by the model. Utilizing the fact that similar simulation outputs lie in proximity of each other in a feature space, the modeler can focus on informing the system about which behaviors are more interesting than others by labeling, rather than analyzing simulation results with custom scripts and workflows. This results in a large reduction in time-consuming manual work by the modeler early in a modeling project, which can substantially reduce the time needed to go from an initial model to testable predictions and downstream analysis. Availability and implementation A Python package is available at https://github.com/Wrede/mio.git. Supplementary information Supplementary data are available at Bioinformatics online.
13

Fischer, T., D. Naumov, S. Sattler, O. Kolditz, and M. Walther. "GO2OGS 1.0: a versatile workflow to integrate complex geological information with fault data into numerical simulation models." Geoscientific Model Development 8, no. 11 (November 12, 2015): 3681–94. http://dx.doi.org/10.5194/gmd-8-3681-2015.

Abstract:
Abstract. We offer a versatile workflow to convert geological models built with the Paradigm™ GOCAD© (Geological Object Computer Aided Design) software into the open-source VTU (Visualization Toolkit unstructured grid) format for usage in numerical simulation models. Tackling relevant scientific questions or engineering tasks often involves multidisciplinary approaches. Conversion workflows are needed as a way of communication between the diverse tools of the various disciplines. Our approach offers an open-source, platform-independent, robust, and comprehensible method that is potentially useful for a multitude of environmental studies. With two application examples in the Thuringian Syncline, we show how a heterogeneous geological GOCAD model including multiple layers and faults can be used for numerical groundwater flow modeling, in our case employing the OpenGeoSys open-source numerical toolbox for groundwater flow simulations. The presented workflow offers the chance to incorporate increasingly detailed data, utilizing the growing availability of computational power to simulate numerical models.
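As an illustration of the target format of such a conversion, the sketch below writes a one-cell unstructured mesh with a material id to a VTU file using the meshio package. This is only an assumed, minimal example of producing VTU output, not the GO2OGS tooling itself.

```python
# A minimal sketch of writing a VTU (VTK unstructured grid) file with meshio; the geometry
# and the "MaterialIDs" field are illustrative placeholders, not a converted GOCAD model.
import numpy as np
import meshio

points = np.array([[0.0, 0.0, 0.0],
                   [1.0, 0.0, 0.0],
                   [0.0, 1.0, 0.0],
                   [0.0, 0.0, 1.0]])
cells = [("tetra", np.array([[0, 1, 2, 3]]))]                 # one tetrahedral cell
mesh = meshio.Mesh(points, cells, cell_data={"MaterialIDs": [np.array([0])]})
meshio.write("layer_model.vtu", mesh)                         # VTU file usable by VTK-based tools
```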
14

Antunes, Dinler A., Jayvee R. Abella, Sarah Hall-Swan, Didier Devaurs, Anja Conev, Mark Moll, Gregory Lizée, and Lydia E. Kavraki. "HLA-Arena: A Customizable Environment for the Structural Modeling and Analysis of Peptide-HLA Complexes for Cancer Immunotherapy." JCO Clinical Cancer Informatics, no. 4 (September 2020): 623–36. http://dx.doi.org/10.1200/cci.19.00123.

Abstract:
PURPOSE HLA protein receptors play a key role in cellular immunity. They bind intracellular peptides and display them for recognition by T-cell lymphocytes. Because T-cell activation is partially driven by structural features of these peptide-HLA complexes, their structural modeling and analysis are becoming central components of cancer immunotherapy projects. Unfortunately, this kind of analysis is limited by the small number of experimentally determined structures of peptide-HLA complexes. Overcoming this limitation requires developing novel computational methods to model and analyze peptide-HLA structures. METHODS Here we describe a new platform for the structural modeling and analysis of peptide-HLA complexes, called HLA-Arena, which we have implemented using Jupyter Notebook and Docker. It is a customizable environment that facilitates the use of computational tools, such as APE-Gen and DINC, which we have previously applied to peptide-HLA complexes. By integrating other commonly used tools, such as MODELLER and MHCflurry, this environment includes support for diverse tasks in structural modeling, analysis, and visualization. RESULTS To illustrate the capabilities of HLA-Arena, we describe 3 example workflows applied to peptide-HLA complexes. Leveraging the strengths of our tools, DINC and APE-Gen, the first 2 workflows show how to perform geometry prediction for peptide-HLA complexes and structure-based binding prediction, respectively. The third workflow presents an example of large-scale virtual screening of peptides for multiple HLA alleles. CONCLUSION These workflows illustrate the potential benefits of HLA-Arena for the structural modeling and analysis of peptide-HLA complexes. Because HLA-Arena can easily be integrated within larger computational pipelines, we expect its potential impact to vastly increase. For instance, it could be used to conduct structural analyses for personalized cancer immunotherapy, neoantigen discovery, or vaccine development.
15

Kalyuzhnaya, Anna V., Nikolay O. Nikitin, Alexander Hvatov, Mikhail Maslyaev, Mikhail Yachmenkov, and Alexander Boukhanovsky. "Towards Generative Design of Computationally Efficient Mathematical Models with Evolutionary Learning." Entropy 23, no. 1 (December 27, 2020): 28. http://dx.doi.org/10.3390/e23010028.

Abstract:
In this paper, we describe the concept of a generative design approach applied to the automated evolutionary learning of mathematical models in a computationally efficient way. To formalize the problems of model design and co-design, a generalized formulation of the modeling workflow is proposed. A parallelized evolutionary learning approach for the identification of model structure is described for equation-based models and composite machine learning models. Moreover, the involvement of performance models in the design process is analyzed. A set of experiments with various models and computational resources is conducted to verify different aspects of the proposed approach.
16

Lingerfelt, E. J., A. Belianinov, E. Endeve, O. Ovchinnikov, S. Somnath, J. M. Borreguero, N. Grodowitz, et al. "BEAM: A Computational Workflow System for Managing and Modeling Material Characterization Data in HPC Environments." Procedia Computer Science 80 (2016): 2276–80. http://dx.doi.org/10.1016/j.procs.2016.05.410.

17

Grzegorzewski, Jan, Janosch Brandhorst, Kathleen Green, Dimitra Eleftheriadou, Yannick Duport, Florian Barthorscht, Adrian Köller, Danny Yu Jia Ke, Sara De Angelis, and Matthias König. "PK-DB: pharmacokinetics database for individualized and stratified computational modeling." Nucleic Acids Research 49, no. D1 (November 5, 2020): D1358–D1364. http://dx.doi.org/10.1093/nar/gkaa990.

Abstract:
Abstract A multitude of pharmacokinetics studies have been published. However, due to the lack of an open database, pharmacokinetics data, as well as the corresponding meta-information, have been difficult to access. We present PK-DB (https://pk-db.com), an open database for pharmacokinetics information from clinical trials. PK-DB provides curated information on (i) characteristics of studied patient cohorts and subjects (e.g. age, bodyweight, smoking status, genetic variants); (ii) applied interventions (e.g. dosing, substance, route of application); (iii) pharmacokinetic parameters (e.g. clearance, half-life, area under the curve) and (iv) measured pharmacokinetic time-courses. Key features are the representation of experimental errors, the normalization of measurement units, annotation of information to biological ontologies, calculation of pharmacokinetic parameters from concentration-time profiles, a workflow for collaborative data curation, strong validation rules on the data, computational access via a REST API as well as human access via a web interface. PK-DB enables meta-analysis based on data from multiple studies and data integration with computational models. A special focus lies on meta-data relevant for individualized and stratified computational modeling with methods like physiologically based pharmacokinetic (PBPK), pharmacokinetic/pharmacodynamic (PK/PD), or population pharmacokinetic (pop PK) modeling.
18

Aimene, Yamina E., and Ahmed Ouenes. "Geomechanical modeling of hydraulic fractures interacting with natural fractures — Validation with microseismic and tracer data from the Marcellus and Eagle Ford." Interpretation 3, no. 3 (August 1, 2015): SU71–SU88. http://dx.doi.org/10.1190/int-2014-0274.1.

Abstract:
We have developed a new geomechanical workflow to study the mechanics of hydraulic fracturing in naturally fractured unconventional reservoirs. This workflow used the material point method (MPM) for computational mechanics and an equivalent fracture model derived from continuous fracture modeling to represent natural fractures (NFs). We first used the workflow to test the effect of different stress anisotropies on the propagation path of a single NF intersected by a hydraulic fracture. In these elementary studies, increasing the stress anisotropy was found to decrease the curving of a propagating NF, and this could be used to explain the observed trends in the microseismic data. The workflow was applied to Marcellus and Eagle Ford wells, where multiple geomechanical results were validated with microseismic data and tracer tests. Application of the workflow to a Marcellus well provides a strain field that correlates well with microseismicity, and a maximum energy release rate, or J-integral, at each completion stage, which appeared to correlate with the production log and could be used to quantify the impact of skipping completion stages. On the first of two Eagle Ford wells considered, the MPM workflow provided a horizontal differential stress map that showed significant variability imparted by NFs perturbing the regional stress field. Additionally, a map of the strain distribution after stimulating the well showed the same features as the interpreted microseismic data: three distinct regions of microseismic character, supported by tracer tests and explained by the MPM differential stress map. Finally, the workflow was able to estimate, in the second well with no microseismic data, its main performance characteristics as validated by tracer tests. The field-validated MPM geomechanical workflow is a powerful tool for completion optimization in the presence of NFs, which affect the final outcome of hydraulic fracturing in multiple ways.
19

Zhang, Wenjuan, Waleed Diab, Hadi Hajibeygi, and Mohammed Al Kobaisi. "A Computational Workflow for Flow and Transport in Fractured Porous Media Based on a Hierarchical Nonlinear Discrete Fracture Modeling Approach." Energies 13, no. 24 (December 17, 2020): 6667. http://dx.doi.org/10.3390/en13246667.

Abstract:
Modeling flow and transport in fractured porous media has been a topic of intensive research for a number of energy- and environment-related industries. The presence of multiscale fractures makes it an extremely challenging task to resolve accurately and efficiently the flow dynamics at both the local and global scales. To tackle this challenge, we developed a computational workflow that adopts a two-level hierarchical strategy based on fracture length partitioning. This was achieved by specifying a partition length to split the discrete fracture network (DFN) into small-scale fractures and large-scale fractures. Flow-based numerical upscaling was then employed to homogenize the small-scale fractures and the porous matrix into an equivalent/effective single medium, whereas the large-scale fractures were modeled explicitly. As the effective medium properties can be fully tensorial, the developed hierarchical framework constructed the discrete systems for the explicit fracture–matrix sub-domains using the nonlinear two-point flux approximation (NTPFA) scheme. This led to a significant reduction of grid orientation effects, thus developing a robust, applicable, and field-relevant framework. To assess the efficacy of the proposed hierarchical workflow, several numerical simulations were carried out to systematically analyze the effects of the homogenized explicit cutoff length scale, as well as the fracture length and orientation distributions. The effect of different boundary conditions, namely, the constant pressure drop boundary condition and the linear pressure boundary condition, for the numerical upscaling on the accuracy of the workflow was investigated. The results show that when the partition length is much larger than the characteristic length of the grid block, and when the DFN has a predominant orientation, which is often the case in practical simulations, the workflow employing linear pressure boundary conditions for numerical upscaling gives results closer to the full-model reference solutions. Our findings shed new light on the development of meaningful computational frameworks for highly fractured, heterogeneous geological media where fractures are present at multiple scales.
20

Lucidi, A., E. Giordano, F. Clementi, and R. Quattrini. "POINT CLOUD EXPLOITATION FOR STRUCTURAL MODELING AND ANALYSIS: A RELIABLE WORKFLOW." International Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences XLIII-B2-2021 (June 28, 2021): 891–98. http://dx.doi.org/10.5194/isprs-archives-xliii-b2-2021-891-2021.

Abstract:
Abstract. The digitization and geometric knowledge of the historical built heritage is currently based on point clouds, which are rarely, or only partially, used as digital twins for structural analysis. The present work deals with the survey of historical artefacts, with particular reference to masonry structures, aimed at their structural analysis and assessment. In detail, the study proposes a methodology capable of employing semi-directly the original data obtained from the 3D digital survey for the generation of a Finite Element Model (FEM), used for structural analysis of masonry buildings. The methodology presents a reliable workflow with a twofold purpose: improving the process of transforming the point cloud into a solid and subsequently obtaining a high-quality and detailed model for structural analyses. Through the application of the methodology to a case study, the consistency of the method was assessed, regarding the smoothness of the whole procedure and the dynamic characterization of the Finite Element Model. The main improvement with respect to similar or our previous workflows is obtained by the introduction of retopology in the data processing, allowing the transformation of the raw data into a solid model with an optimal balance between Level of Detail (LOD) and computational weight. Another significant aspect of the optimized process is undoubtedly the possibility of faithfully respecting the semantics of the structure, leading to the discretization of the model into different parts depending on the materials. This work may represent an excellent reference for the study of masonry artefacts belonging to the existing historical heritage, starting from surveys and aimed at structural and seismic evaluations, in the general framework of knowledge-based preservation of heritage.
21

Winkler, Robert. "An evolving computational platform for biological mass spectrometry: workflows, statistics and data mining with MASSyPup64." PeerJ 3 (November 17, 2015): e1401. http://dx.doi.org/10.7717/peerj.1401.

Abstract:
In biological mass spectrometry, crude instrumental data need to be converted into meaningful theoretical models. Several data processing and data evaluation steps are required to come to the final results. These operations are often difficult to reproduce, because the computing platforms are too specific. This effect, known as ‘workflow decay’, can be diminished by using a standardized informatic infrastructure. Thus, we compiled an integrated platform, which contains ready-to-use tools and workflows for mass spectrometry data analysis. Apart from general unit operations, such as peak picking and identification of proteins and metabolites, we put a strong emphasis on the statistical validation of results and Data Mining. MASSyPup64 includes, e.g., the OpenMS/TOPPAS framework, the Trans-Proteomic-Pipeline programs, the ProteoWizard tools, X!Tandem, Comet and SpiderMass. The statistical computing language R is installed with packages for MS data analyses, such as XCMS/metaXCMS and MetabR. The R package Rattle provides user-friendly access to multiple Data Mining methods. Further, we added the non-conventional spreadsheet program teapot for editing large data sets and a command line tool for transposing large matrices. Individual programs, console commands and modules can be integrated using the Workflow Management System (WMS) taverna. We explain the useful combination of the tools by practical examples: (1) a workflow for protein identification and validation, with subsequent Association Analysis of peptides, (2) cluster analysis and Data Mining in targeted Metabolomics, and (3) raw data processing, Data Mining and identification of metabolites in untargeted Metabolomics. Association Analyses reveal relationships between variables across different sample sets. We present its application for finding co-occurring peptides, which can be used for target proteomics, the discovery of alternative biomarkers and protein–protein interactions. Data Mining derived models displayed a higher robustness and accuracy for classifying sample groups in targeted Metabolomics than cluster analyses. Random Forest models provide not only predictive models, which can be deployed for new data sets, but also the variable importance. We demonstrate that the latter is especially useful for tracking down significant signals and affected pathways in untargeted Metabolomics. Thus, Random Forest modeling supports the unbiased search for relevant biological features in Metabolomics. Our results clearly manifest the importance of Data Mining methods to disclose non-obvious information in biological mass spectrometry. The application of a Workflow Management System and the integration of all required programs and data in a consistent platform make the presented data analysis strategies reproducible for non-expert users. The simple remastering process and the Open Source licenses of MASSyPup64 (http://www.bioprocess.org/massypup/) enable the continuous improvement of the system.
22

Bao, Liang, Chase Wu, Xiaoxuan Bu, Nana Ren, and Mengqing Shen. "Performance Modeling and Workflow Scheduling of Microservice-Based Applications in Clouds." IEEE Transactions on Parallel and Distributed Systems 30, no. 9 (September 1, 2019): 2114–29. http://dx.doi.org/10.1109/tpds.2019.2901467.

23

van Zelst, Sebastiaan J., and Sander J. J. Leemans. "Translating Workflow Nets to Process Trees: An Algorithmic Approach." Algorithms 13, no. 11 (November 2, 2020): 279. http://dx.doi.org/10.3390/a13110279.

Abstract:
Since their introduction, process trees have been frequently used as a process modeling formalism in many process mining algorithms. A process tree is a (mathematical) tree-based model of a process, in which internal vertices represent behavioral control-flow relations and leaves represent process activities. Translation of a process tree into a sound workflow net is trivial. However, the reverse is not the case. Simultaneously, an algorithm that translates a WF-net into a process tree is of great interest, e.g., the explicit knowledge of the control-flow hierarchy in a WF-net allows one to reason on its behavior more easily. Hence, in this paper, we present such an algorithm, i.e., it detects whether a WF-net corresponds to a process tree, and, if so, constructs it. We prove that, if the algorithm finds a process tree, the language of the process tree is equal to the language of the original WF-net. The experiments conducted show that the algorithm’s corresponding implementation has a quadratic time complexity in the size of the WF-net. Furthermore, the experiments show strong evidence of process tree rediscoverability.
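A small sketch of the process tree representation described above: internal vertices carry control-flow operators and leaves carry activities. The operator symbols and the class layout are assumptions chosen for illustration, not the paper's implementation.

```python
# A minimal sketch of a process tree: leaves are activities, internal nodes are operators
# (sequence, exclusive choice, parallel, loop). Representation details are assumed.
from dataclasses import dataclass, field
from typing import List

SEQ, XOR, AND, LOOP = "->", "x", "+", "*"

@dataclass
class ProcessTree:
    operator: str = None                     # None for a leaf
    label: str = None                        # activity name for a leaf
    children: List["ProcessTree"] = field(default_factory=list)

    def __str__(self):
        if self.operator is None:
            return self.label
        return f"{self.operator}({', '.join(str(c) for c in self.children)})"

# ->(a, x(b, c), +(d, e)): do a, then either b or c, then d and e concurrently
tree = ProcessTree(SEQ, children=[
    ProcessTree(label="a"),
    ProcessTree(XOR, children=[ProcessTree(label="b"), ProcessTree(label="c")]),
    ProcessTree(AND, children=[ProcessTree(label="d"), ProcessTree(label="e")]),
])
print(tree)   # ->(a, x(b, c), +(d, e))
```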
24

Busetti, Seth. "A method for modeling multiscale geomechanical effects in the stimulated rock volume." Interpretation 9, no. 1 (January 7, 2021): T45–T61. http://dx.doi.org/10.1190/int-2020-0090.1.

Abstract:
I have developed a workflow to efficiently simulate geomechanical effects in the stimulated rock volume (SRV) by including regional geologic structures such as faults and folds as well as high-resolution-oriented mechanical stratigraphy. The motivation is that the local model used for hydraulic fracture analysis should include macroscale 3D geomechanical effects derived from regional tectonic and seismic data. A practical computational strategy is developed to link multiple 3D geomechanical models derived at different scales and their associated stress effects. I apply the workflow to a synthetic reservoir problem composed of a tectonic-scale structural framework model with three embedded mechanical stratigraphic models representing three stimulated vertical wells. I first combine regional stresses solved with 3D finite-element analysis with perturbation stresses from elastic dislocation modeling using elastic superposition concepts. I then apply the macroscale stress effects as unique boundary conditions to an embedded finite-element submodel, a mesoscale stratigraphic model representing the SRV allowing for resolution of variable stress amplification, and stress rotation in geologic sublayers. Finally, I conduct hydraulic-fracture simulations within the SRV models. The simulated hydraulic fractures are controlled by the structural position and mechanical stratigraphy. Closest to the back limb of the main structural anticline, hydraulic fractures tend to be height-restricted, and in some realizations, fractures propagate horizontally. Adjacent to the fold and a fault, where differential stresses are elevated, fracture growth is the most unconstrained in height and length. Results suggest that this multiscale approach can be applied to better predict and understand behaviors related to unconventional reservoir stimulation. The workflow could easily be modified for other operational problems and geologic settings.
25

Mahadevan, Vijay S., Iulian Grindeanu, Robert Jacob, and Jason Sarich. "Improving climate model coupling through a complete mesh representation: a case study with E3SM (v1) and MOAB (v5.x)." Geoscientific Model Development 13, no. 5 (May 26, 2020): 2355–77. http://dx.doi.org/10.5194/gmd-13-2355-2020.

Abstract:
Abstract. One of the fundamental factors contributing to the spatiotemporal inaccuracy in climate modeling is the mapping of solution field data between different discretizations and numerical grids used in the coupled component models. The typical climate computational workflow involves evaluation and serialization of the remapping weights during the preprocessing step, which are then consumed by the coupled driver infrastructure during simulation to compute field projections. Tools like the Earth System Modeling Framework (ESMF) (Hill et al., 2004) and TempestRemap (Ullrich et al., 2013) offer the capability to generate conservative remapping weights, while the Model Coupling Toolkit (MCT) (Larson et al., 2001), which is utilized in many production climate models, exposes functionality to make use of the operators to solve the coupled problem. However, such multistep processes present several hurdles in terms of the scientific workflow and impede research productivity. In order to overcome these limitations, we present a fully integrated infrastructure based on the Mesh Oriented datABase (MOAB) (Tautges et al., 2004; Mahadevan et al., 2015) library, which allows for a complete description of the numerical grids and solution data used in each submodel. Through a scalable advancing-front intersection algorithm, the supermesh of the source and target grids is computed, which is then used to assemble the high-order, conservative, and monotonicity-preserving remapping weights between discretization specifications. The Fortran-compatible interfaces in MOAB are utilized to directly link the submodels in the Energy Exascale Earth System Model (E3SM) to enable online remapping strategies in order to simplify the coupled workflow process. We demonstrate the superior computational efficiency of the remapping algorithms in comparison with other state-of-the-science tools and present strong scaling results on large-scale machines for computing remapping weights between the spectral element atmosphere and finite volume discretizations on the polygonal ocean grids.
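Once remapping weights have been generated and serialized, applying them at run time amounts to a sparse matrix-vector product, as sketched below with an invented two-by-three weight matrix. This illustrates the general mechanism only, not the MOAB or TempestRemap APIs.

```python
# A minimal sketch: projecting a source field onto a target grid with precomputed remapping
# weights is a sparse matvec. The weight matrix below is illustrative (rows sum to 1).
import numpy as np
from scipy.sparse import csr_matrix

rows = np.array([0, 0, 1, 1])
cols = np.array([0, 1, 1, 2])
vals = np.array([0.6, 0.4, 0.3, 0.7])
weights = csr_matrix((vals, (rows, cols)), shape=(2, 3))   # 2 target cells, 3 source cells

source_field = np.array([280.0, 290.0, 300.0])             # e.g., surface temperature [K]
target_field = weights @ source_field                      # remapped field on the target grid
print(target_field)                                        # [284. 297.]
```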
26

Zielinski, Daniel Craig, Arjun Patel, and Bernhard O. Palsson. "The Expanding Computational Toolbox for Engineering Microbial Phenotypes at the Genome Scale." Microorganisms 8, no. 12 (December 21, 2020): 2050. http://dx.doi.org/10.3390/microorganisms8122050.

Abstract:
Microbial strains are being engineered for an increasingly diverse array of applications, from chemical production to human health. While traditional engineering disciplines are driven by predictive design tools, these tools have been difficult to build for biological design due to the complexity of biological systems and many unknowns of their quantitative behavior. However, due to many recent advances, the gap between design in biology and other engineering fields is closing. In this work, we discuss promising areas of development of computational tools for engineering microbial strains. We define five frontiers of active research: (1) Constraint-based modeling and metabolic network reconstruction, (2) Kinetics and thermodynamic modeling, (3) Protein structure analysis, (4) Genome sequence analysis, and (5) Regulatory network analysis. Experimental and machine learning drivers have enabled these methods to improve by leaps and bounds in both scope and accuracy. Modern strain design projects will require these tools to be comprehensively applied to the entire cell and efficiently integrated within a single workflow. We expect that these frontiers, enabled by the ongoing revolution of big data science, will drive forward more advanced and powerful strain engineering strategies.
27

Wang, Ning, Diane Lefaudeux, Anup Mazumder, Jingyi Jessica Li, and Alexander Hoffmann. "Identifying the combinatorial control of signal-dependent transcription factors." PLOS Computational Biology 17, no. 6 (June 24, 2021): e1009095. http://dx.doi.org/10.1371/journal.pcbi.1009095.

Abstract:
The effectiveness of immune responses depends on the precision of stimulus-responsive gene expression programs. Cells specify which genes to express by activating stimulus-specific combinations of stimulus-induced transcription factors (TFs). Their activities are decoded by a gene regulatory strategy (GRS) associated with each response gene. Here, we examined whether the GRSs of target genes may be inferred from stimulus-response (input-output) datasets, which remains an unresolved model-identifiability challenge. We developed a mechanistic modeling framework and computational workflow to determine the identifiability of all possible combinations of synergistic (AND) or non-synergistic (OR) GRSs involving three transcription factors. Considering different sets of perturbations for stimulus-response studies, we found that two thirds of GRSs are easily distinguishable but that substantially more quantitative data is required to distinguish the remaining third. To enhance the accuracy of the inference with timecourse experimental data, we developed an advanced error model that avoids error overestimates by distinguishing between value and temporal error. Incorporating this error model into a Bayesian framework, we show that GRS models can be identified for individual genes by considering multiple datasets. Our analysis rationalizes the allocation of experimental resources by identifying most informative TF stimulation conditions. Applying this computational workflow to experimental data of immune response genes in macrophages, we found that a much greater fraction of genes are combinatorially controlled than previously reported by considering compensation among transcription factors. Specifically, we revealed that a group of known NFκB target genes may also be regulated by IRF3, which is supported by chromatin immuno-precipitation analysis. Our study provides a computational workflow for designing and interpreting stimulus-response gene expression studies to identify underlying gene regulatory strategies and further a mechanistic understanding.
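As a simplified illustration of the combinatorial space the paper explores, the sketch below enumerates left-to-right AND/OR combinations of three transcription factors and prints the on/off pattern each strategy produces over all input states. The TF names are hypothetical, and the full study considers a richer set of strategies and quantitative data than this folding scheme.

```python
# A minimal sketch: enumerate AND (synergistic) / OR (non-synergistic) gene regulatory
# strategies over three TFs and show which input patterns activate the gene. Illustrative only.
from itertools import product

TFS = ["TF_A", "TF_B", "TF_C"]                 # hypothetical transcription factor names

def gene_on(tf_active, ops):
    """Fold TF activities left to right with the given AND/OR operators."""
    result = tf_active[TFS[0]]
    for tf, op in zip(TFS[1:], ops):
        result = (result and tf_active[tf]) if op == "AND" else (result or tf_active[tf])
    return result

strategies = list(product(["AND", "OR"], repeat=2))                  # 4 left-to-right strategies
inputs = [dict(zip(TFS, bits)) for bits in product([0, 1], repeat=3)]

for ops in strategies:
    pattern = "".join(str(int(gene_on(x, ops))) for x in inputs)
    print(ops, pattern)   # strategies with identical patterns cannot be told apart from on/off data alone
```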
28

Zhou, Jia, Hany Abdel-Khalik, Paul Talbot, and Cristian Rabiti. "A Hybrid Energy System Workflow for Energy Portfolio Optimization." Energies 14, no. 15 (July 21, 2021): 4392. http://dx.doi.org/10.3390/en14154392.

Abstract:
This manuscript develops a workflow, driven by data analytics algorithms, to support the optimization of the economic performance of an Integrated Energy System (IES). The goal is to determine the optimum mix of capacities from a set of different energy producers (e.g., nuclear, gas, wind and solar). A stochastic-based optimizer is employed, based on Gaussian Process Modeling, which requires numerous samples for its training. Each sample represents a time series describing the demand, load, or other operational and economic profiles for various types of energy producers. These samples are synthetically generated using a reduced order modeling algorithm that reads a limited set of historical data, such as demand and load data from past years. Numerous data analysis methods are employed to construct the reduced order models, including, for example, the Auto Regressive Moving Average, Fourier series decomposition, and the peak detection algorithm. All these algorithms are designed to detrend the data and extract features that can be employed to generate synthetic time histories that preserve the statistical properties of the original limited historical data. The optimization cost function is based on an economic model that assesses the effective cost of energy based on two figures of merit: the specific cash flow stream for each energy producer and the total Net Present Value. An initial guess for the optimal capacities is obtained using the screening curve method. The results of the Gaussian Process model-based optimization are assessed using an exhaustive Monte Carlo search, and the comparison indicates that the optimization results are reasonable. The workflow has been implemented inside the Idaho National Laboratory’s Risk Analysis and Virtual Environment (RAVEN) framework. The main contribution of this study addresses several challenges in the current optimization methods of energy portfolios in IES: first, the feasibility of generating the synthetic time series of the periodic peak data; second, the computational burden of the conventional stochastic optimization of the energy portfolio, associated with the need for repeated executions of system models; and third, the inadequacies of previous studies in terms of the comparisons of the impact of the economic parameters. The proposed workflow can provide a scientifically defensible strategy to support decision-making in the electricity market and to help energy distributors develop a better understanding of the performance of integrated energy systems.
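To make the two economic figures of merit concrete, the sketch below discounts an invented yearly cash flow stream for each producer and sums the portfolio Net Present Value. All capacities, costs, and the discount rate are illustrative; this is not the RAVEN implementation.

```python
# A minimal sketch of the figures of merit: per-producer cash flow streams and total NPV.
def npv(cash_flows, discount_rate):
    """Discount a list of yearly cash flows (year 0 first) to present value."""
    return sum(cf / (1.0 + discount_rate) ** year for year, cf in enumerate(cash_flows))

# Hypothetical capacities -> capital cost in year 0, then net revenue for three years
producers = {
    "nuclear": [-900.0, 120.0, 120.0, 120.0],
    "wind":    [-300.0,  60.0,  55.0,  50.0],
    "gas":     [-150.0,  40.0,  40.0,  40.0],
}
rate = 0.07
for name, flows in producers.items():
    print(name, round(npv(flows, rate), 1))
print("portfolio NPV:", round(sum(npv(f, rate) for f in producers.values()), 1))
```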
29

Haiman, Zachary B., Daniel C. Zielinski, Yuko Koike, James T. Yurkovich, and Bernhard O. Palsson. "MASSpy: Building, simulating, and visualizing dynamic biological models in Python using mass action kinetics." PLOS Computational Biology 17, no. 1 (January 28, 2021): e1008208. http://dx.doi.org/10.1371/journal.pcbi.1008208.

Abstract:
Mathematical models of metabolic networks utilize simulation to study system-level mechanisms and functions. Various approaches have been used to model the steady state behavior of metabolic networks using genome-scale reconstructions, but formulating dynamic models from such reconstructions continues to be a key challenge. Here, we present the Mass Action Stoichiometric Simulation Python (MASSpy) package, an open-source computational framework for dynamic modeling of metabolism. MASSpy utilizes mass action kinetics and detailed chemical mechanisms to build dynamic models of complex biological processes. MASSpy adds dynamic modeling tools to the COnstraint-Based Reconstruction and Analysis Python (COBRApy) package to provide a unified framework for constraint-based and kinetic modeling of metabolic networks. MASSpy supports high-performance dynamic simulation through its implementation of libRoadRunner, the Systems Biology Markup Language (SBML) simulation engine. Three examples are provided to demonstrate how to use MASSpy: (1) a validation of the MASSpy modeling tool through dynamic simulation of detailed mechanisms of enzyme regulation; (2) a feature demonstration using a workflow for generating an ensemble of kinetic models using Monte Carlo sampling to approximate missing numerical values of parameters and to quantify biological uncertainty; and (3) a case study in which MASSpy is utilized to overcome issues that arise when integrating experimental data with the computation of functional states of detailed biological mechanisms. MASSpy represents a powerful tool to address challenges that arise in dynamic modeling of metabolic networks, both at small and large scales.
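As a minimal illustration of the mass action kinetics on which MASSpy builds, the sketch below integrates the reversible reaction A + B <-> C as an ODE system with SciPy. It uses assumed rate constants and does not use the MASSpy API itself.

```python
# A minimal sketch of mass action kinetics: reversible A + B <-> C integrated as ODEs.
from scipy.integrate import solve_ivp

kf, kr = 1.0, 0.1                      # forward/reverse rate constants (assumed values)

def rhs(t, y):
    a, b, c = y
    v = kf * a * b - kr * c            # net mass action rate of A + B <-> C
    return [-v, -v, v]                 # d[A]/dt, d[B]/dt, d[C]/dt

sol = solve_ivp(rhs, (0.0, 10.0), [1.0, 0.5, 0.0])
print(sol.y[:, -1])                    # concentrations of A, B, C at t = 10
```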
30

Berg, Philipp, Sylvia Saalfeld, Samuel Voß, Oliver Beuing, and Gábor Janiga. "A review on the reliability of hemodynamic modeling in intracranial aneurysms: why computational fluid dynamics alone cannot solve the equation." Neurosurgical Focus 47, no. 1 (July 2019): E15. http://dx.doi.org/10.3171/2019.4.focus19181.

Abstract:
Computational blood flow modeling in intracranial aneurysms (IAs) has enormous potential for the assessment of highly resolved hemodynamics and derived wall stresses. This results in improved knowledge in important research fields, such as rupture risk assessment and treatment optimization. However, due to the required assumptions and simplifications, its applicability in a clinical context remains limited. This review article focuses on the main aspects along the interdisciplinary modeling chain and highlights the circumstance that computational fluid dynamics (CFD) simulations are embedded in a multiprocess workflow. These aspects include imaging-related steps, the setup of realistic hemodynamic simulations, and the analysis of multidimensional computational results. To condense the broad knowledge, specific recommendations are provided at the end of each subsection. Overall, various individual substudies exist in the literature that have evaluated relevant technical aspects. In this regard, the importance of precise vessel segmentations for the simulation outcome is emphasized. Furthermore, the accuracy of the computational model strongly depends on the specific research question. Additionally, standardization in the context of flow analysis is required to enable an objective comparison of research findings and to avoid confusion within the medical community. Finally, uncertainty quantification and validation studies should always accompany numerical investigations. In conclusion, this review aims for an improved awareness among physicians regarding potential sources of error in hemodynamic modeling for IAs. Although CFD is a powerful methodology, it cannot provide reliable information if pre- and postsimulation steps are inaccurately carried out. From this, future studies can be critically evaluated and real benefits can be differentiated from results that have been acquired based on technically inaccurate procedures.
31

Yan, Jiayi, Karen Kensek, Kyle Konis, and Douglas Noble. "CFD Visualization in a Virtual Reality Environment Using Building Information Modeling Tools." Buildings 10, no. 12 (December 4, 2020): 229. http://dx.doi.org/10.3390/buildings10120229.

Abstract:
Scientific visualization has been an essential process in the engineering field, enabling the tracking of large-scale simulation data and providing intuitive and comprehensible graphs and models that display useful data. For computational fluid dynamics (CFD) data, the need for scientific visualization is even more important given the complicated spatial data structure and large quantities of data points characteristic of CFD data. To better take advantage of CFD results for buildings, the potential use of virtual reality (VR) techniques cannot be overlooked in the development of building projects. However, the workflow required to bring CFD simulation results to VR has not been streamlined. Building information modeling (BIM) as a lifecycle tool for buildings includes as much information as possible for further applications. To this end, this study brings CFD visualization to VR using BIM tools and reports the evaluation and analysis of the results.
32

Gerstner, Wulfram, Henning Sprekeler, and Gustavo Deco. "Theory and Simulation in Neuroscience." Science 338, no. 6103 (October 4, 2012): 60–65. http://dx.doi.org/10.1126/science.1227356.

Abstract:
Modeling work in neuroscience can be classified using two different criteria. The first one is the complexity of the model, ranging from simplified conceptual models that are amenable to mathematical analysis to detailed models that require simulations in order to understand their properties. The second criterion is that of direction of workflow, which can be from microscopic to macroscopic scales (bottom-up) or from behavioral target functions to properties of components (top-down). We review the interaction of theory and simulation using examples of top-down and bottom-up studies and point to some current developments in the fields of computational and theoretical neuroscience.
33

Cogle, Christopher R., Taher Abbasi, Neeraj Kumar Singh, Mohammed Sauban, Rahul K. Raman, Robinson Vidva, Anuj Tyagi, et al. "AraC-Daunorubicin-Etoposide (ADE) Response Prediction in Pediatric AML Patients Using a Computational Biology Modeling (CBM) Based Precision Medicine Workflow." Blood 132, Supplement 1 (November 29, 2018): 4034. http://dx.doi.org/10.1182/blood-2018-99-115775.

Abstract:
Background: Pediatric AML (pAML) treatment outcomes can vary due to genomic heterogeneity. Thus, selecting the right drugs for a given patient is challenging. There is a need for a priori means of predicting treatment responses based on tumor "omics". Computational biology modeling (CBM) is a precision medicine approach by which biological pathways of tumorigenesis are mapped using mathematical principles to yield a virtual, interactive tumor model. This model can be customized based on a patient's omics and analyzed virtually for response to therapies. Aim: To define prediction values of a CBM precision medicine approach in matching clinical response to ADE therapy in a cohort of pAML patients. Methods: Thirty pAML patients treated with ADE chemotherapy were included, and the clinical, genomic (cytogenetics, mutations), and protein expression data from this cohort were used for the CBM. From cytogenetics results, gene copy number variations were coded as either knocked-down (KD) or over-expressed (OE). From NGS results (2-gene panel: CEBPA, NPM1), gene mutations were coded as either loss or gain of function (LOF or GOF). For protein expression data, proteins that were >2 sigma from the mean were coded as KD if their value was <0 or OE if their value was >0. Proteins with values <2 sigma from the mean were not included in the CBM as perturbed. The LOF, GOF, KD, and OE data were input into the CBM software system (Cellworks Group) to generate patient-specific maps of AML. Each map showed the unique interplay of dysregulated networks for the patient's AML. Digital drug simulations were then conducted in each map to measure the impact of cytarabine, daunorubicin, and etoposide alone and in combination to predict an AML disease inhibition score (DIS), a composite of cell proliferation, viability, apoptosis, and impact on patient-specific biomarkers. Response to treatment was determined based on a threshold DIS range derived from AML training datasets comprising omics and clinical outcomes. Clinical outcome data for these pAML patients treated with ADE were compared with CBM predictions. Clinical response was defined as complete response at the end of consolidation therapy as per International Working Group 2006 criteria. Results: Assessment was made for 30 patients (14 female, median age 14 years), all of whom achieved CR; predictions were made for all but one patient, who lacked sufficient genomic inputs. CBM accurately predicted the clinical outcomes of 28 of 29 responders, with an accuracy and positive predictive value of 96%. The predictive score was positively correlated with age at diagnosis, DFS, and OS (Pearson coefficients of 0.22, 0.54, and 0.49, respectively). Analysis of the individual drug responses of each patient indicated that some of the drugs were predicted to be non-responsive based on the patient-disease pathway characteristics and could have been eliminated from the treatment, thus reducing the overall adverse impact of the very intensive therapy regimen. There were profiles in which AraC was a responder due to a deficient mismatch repair pathway in the disease network resulting from the presence of aberrations such as KMT2A-AFDN, RUNX1-RUNX1T1, CEBPA LOF, KDM1A OE, and MSH2 KD, while daunorubicin and etoposide were predicted as non-responders due to an intact homologous DNA repair pathway resulting from the absence of aberrations in HR pathway genes.
CBM analysis of patient-omics-driven disease characteristics could have eliminated additional drugs for such patients. Conclusion: The CBM prediction of ADE response in pAML patients based on genomic, proteomic, and clinical data showed a high predictive accuracy of 96.55%. CBM analysis of patients' genomics- and proteomics-driven disease characteristics and individual drug response prediction indicated that the intensive therapy regimen can be tailored for each patient to minimize toxicity by removing non-responsive drugs. The study validates the approach of a priori predicting response and identifying the optimal therapy option for the patient. Disclosures Cogle: Celgene: Other: Steering Committee Member of Connect MDS/AML Registry. Abbasi: Cell Works Group Inc.: Employment. Singh: Cellworks Research India Private Limited: Employment. Sauban: Cellworks Research India Private Limited: Employment. Raman: Cellworks Research India Private Limited: Employment. Vidva: Cellworks Research India Private Limited: Employment. Tyagi: Cellworks Research India Private Limited: Employment. Talawdekar: Cellworks Research India Private Limited: Employment. Das: Cellworks Research India Private Limited: Employment. Vali: Cell Works Group Inc.: Employment.
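The omics-encoding step described in this abstract can be illustrated with a small sketch. The helper below is hypothetical (it is not the Cellworks CBM software or its API) and assumes per-protein cohort statistics are available; it simply applies the stated 2-sigma rule to decide whether a protein enters the model as knocked-down (KD) or over-expressed (OE).

```python
# Hypothetical helper illustrating the protein-expression coding rule described above
# (not the Cellworks CBM software): values more than 2 sigma from the cohort mean are
# coded as knocked-down (KD) when below the mean or over-expressed (OE) when above it;
# proteins within 2 sigma are left out of the model as unperturbed.
def code_protein_expression(values, cohort_mean, cohort_sd):
    """Map {protein: expression} to {protein: 'KD' or 'OE'} using a 2-sigma cutoff."""
    codes = {}
    for protein, value in values.items():
        z = (value - cohort_mean[protein]) / cohort_sd[protein]
        if abs(z) > 2.0:
            codes[protein] = "KD" if z < 0 else "OE"
    return codes

# Illustrative values only: MSH2 is coded KD, KDM1A is coded OE, TP53 is unperturbed.
patient = {"MSH2": 2.1, "KDM1A": 9.8, "TP53": 5.2}
mean = {"MSH2": 5.0, "KDM1A": 5.5, "TP53": 5.0}
sd = {"MSH2": 1.0, "KDM1A": 1.5, "TP53": 1.2}
print(code_protein_expression(patient, mean, sd))
```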
34

van der Hooft, Justin Johan Jozias, Joe Wandy, Michael P. Barrett, Karl E. V. Burgess, and Simon Rogers. "Topic modeling for untargeted substructure exploration in metabolomics." Proceedings of the National Academy of Sciences 113, no. 48 (November 16, 2016): 13738–43. http://dx.doi.org/10.1073/pnas.1608041113.

Abstract:
The potential of untargeted metabolomics to answer important questions across the life sciences is hindered because of a paucity of computational tools that enable extraction of key biochemically relevant information. Available tools focus on using mass spectrometry fragmentation spectra to identify molecules whose behavior suggests they are relevant to the system under study. Unfortunately, fragmentation spectra cannot identify molecules in isolation but require authentic standards or databases of known fragmented molecules. Fragmentation spectra are, however, replete with information pertaining to the biochemical processes present, much of which is currently neglected. Here, we present an analytical workflow that exploits all fragmentation data from a given experiment to extract biochemically relevant features in an unsupervised manner. We demonstrate that an algorithm originally used for text mining, latent Dirichlet allocation, can be adapted to handle metabolomics datasets. Our approach extracts biochemically relevant molecular substructures (“Mass2Motifs”) from spectra as sets of co-occurring molecular fragments and neutral losses. The analysis allows us to isolate molecular substructures, whose presence allows molecules to be grouped based on shared substructures regardless of classical spectral similarity. These substructures, in turn, support putative de novo structural annotation of molecules. Combining this spectral connectivity to orthogonal correlations (e.g., common abundance changes under system perturbation) significantly enhances our ability to provide mechanistic explanations for biological behavior.
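The core idea, treating each fragmentation spectrum as a "document" and binned fragment or neutral-loss features as "words", can be sketched with an off-the-shelf LDA implementation. The snippet below uses a random count matrix purely as a stand-in for real spectra and is not the authors' MS2LDA code.

```python
# Minimal sketch of LDA-based substructure discovery on a hypothetical spectrum-by-feature
# count matrix (rows = spectra, columns = discretized fragment/neutral-loss features).
import numpy as np
from sklearn.decomposition import LatentDirichletAllocation

rng = np.random.default_rng(0)
counts = rng.poisson(0.3, size=(200, 500))      # stand-in for real fragmentation data

lda = LatentDirichletAllocation(n_components=30, random_state=0)
spectrum_motif = lda.fit_transform(counts)      # spectrum-to-motif loadings
motif_feature = lda.components_                 # motif-to-fragment/loss weights

# The highest-weighted features of a motif suggest a shared molecular substructure.
print(np.argsort(motif_feature[0])[::-1][:10])
```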
35

Aboushady, Dina, Maria Kristina Parr, and Rasha S. Hanafi. "Quality-by-Design Is a Tool for Quality Assurance in the Assessment of Enantioseparation of a Model Active Pharmaceutical Ingredient." Pharmaceuticals 13, no. 11 (November 4, 2020): 364. http://dx.doi.org/10.3390/ph13110364.

Abstract:
The design of experiments (DoE) is one of the quality-by-design tools valued in analytical method development, not only for cost reduction and time effectiveness, but also for enabling analytical method control and understanding via a systematic workflow, leading to analytical methods with built-in quality. This work aimed to use DoE to enhance method understanding for a developed UHPLC enantioseparation of terbutaline (TER), a model chiral drug, and to define quality assurance parameters associated with using chiral mobile phase additives (CMPA). Within a response surface methodology workflow, the effect of different factors on both chiral resolution and retention was screened and optimized using Plackett-Burman and central composite designs, respectively, followed by multivariate mathematical modeling. This study was able to delimit method robustness and elucidate enantiorecognition mechanisms involved in interactions of TER with the chiral modifiers. Among many CMPAs, successful TER enantioresolution was achieved using hydroxypropyl β-cyclodextrin (HP-β-CD) added to the mobile phase as 5.4 mM HP-β-CD in 52.25 mM ammonium acetate. Yet, limited method robustness was observed upon switching between the different tested CMPAs, leading to the conclusion that quality can only be assured with a specific minimum pre-run conditioning time with the CMPA, namely 16 column volumes (60 min at 0.1 mL/min). For enantiorecognition understanding, computational molecular modeling revealed hydrogen bonding as the main binding interaction, in addition to dipole-dipole interactions inside the CD cavity for the R enantiomer, while the S enantiomer was less interactive.
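The response-surface step can be illustrated with a quadratic model fitted by least squares; the factor levels and responses below are invented for illustration and do not reproduce the published design or data.

```python
# Illustrative quadratic response-surface fit on hypothetical coded factors x1, x2
# (e.g., CMPA and buffer concentrations) and a hypothetical response (e.g., resolution Rs):
# y = b0 + b1*x1 + b2*x2 + b11*x1^2 + b22*x2^2 + b12*x1*x2
import numpy as np

x1 = np.array([-1, 1, -1, 1, -1.41, 1.41, 0, 0, 0, 0, 0])
x2 = np.array([-1, -1, 1, 1, 0, 0, -1.41, 1.41, 0, 0, 0])
rs = np.array([0.8, 1.2, 1.1, 1.6, 0.7, 1.4, 0.9, 1.5, 1.3, 1.25, 1.35])

X = np.column_stack([np.ones_like(x1), x1, x2, x1**2, x2**2, x1 * x2])
coef, *_ = np.linalg.lstsq(X, rs, rcond=None)
print(dict(zip(["b0", "b1", "b2", "b11", "b22", "b12"], np.round(coef, 3))))
```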
36

Hu, Ting, Hong Liu, Xuebao Guo, Yuxin Yuan, and Zhiyang Wang. "Analysis of direction-decomposed and vector-based elastic reverse time migration using the Hilbert transform." GEOPHYSICS 84, no. 6 (November 1, 2019): S599—S617. http://dx.doi.org/10.1190/geo2018-0324.1.

Abstract:
Straightforward implementations of elastic reverse time migration (ERTM) often produce imaging artifacts associated with incorrectly imaged mode conversions, crosstalk, and back-scattered energies. To address these issues, we introduced three approaches: (1) vector-based normalized crosscorrelation imaging conditions (VBNICs), (2) directional separation of wavefields to remove low-wavenumber noise, and (3) postimaging filtering of the dip-angle gathers to eliminate the artifacts caused by nonphysical wave modes. These approaches are combined to create an effective ERTM workflow that can produce high-quality images. Numerical examples demonstrate that, first, VBNICs can produce correct polarities for PP/PS images and can compute migrated dip-angle gathers efficiently by using P/S decomposed Poynting vectors. Second, they achieve improved signal-to-noise and higher resolution when performing up/down decomposition before applying VBNICs, and left/right decomposition enhances steep dips imaging at the computational cost of adding the Hilbert transform to a spatial direction. Third, dip filtering using slope-consistency analysis attenuates the remaining artifacts effectively. An application of the SEG advanced modeling program (SEAM) model demonstrates that our ERTM workflow reduces noise and improves imaging ability for complex geologic areas.
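Directional wavefield separation of this kind is often implemented in the frequency-wavenumber domain, which is mathematically related to the Hilbert-transform construction. The sketch below shows quadrant masking on a synthetic plane wave; sign conventions and the handling of zero-frequency components are simplified, and this is not the authors' ERTM code.

```python
# Illustrative f-k quadrant masking for up/down (or left/right) wavefield separation
# on a synthetic snapshot u(z, t); which quadrant is "up" depends on sign conventions.
import numpy as np

def directional_split(u):
    U = np.fft.fft2(u)
    kz = np.fft.fftfreq(u.shape[0])[:, None]
    w = np.fft.fftfreq(u.shape[1])[None, :]
    sgn = np.sign(kz) * np.sign(w)
    return np.fft.ifft2(U * (sgn > 0)).real, np.fft.ifft2(U * (sgn < 0)).real

z, t = np.meshgrid(np.arange(128), np.arange(128), indexing="ij")
u = np.cos(0.3 * z - 0.3 * t)                  # plane wave travelling in one direction
a, b = directional_split(u)
print(np.abs(a).max(), np.abs(b).max())        # energy concentrates in one component
```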
37

Coorey, Benjamin P., and Julie R. Jupp. "Generative spatial performance design system." Artificial Intelligence for Engineering Design, Analysis and Manufacturing 28, no. 3 (July 22, 2014): 277–83. http://dx.doi.org/10.1017/s0890060414000225.

Abstract:
Architectural spatial design is a wicked problem that can have a multitude of solutions for any given brief. The information needed to resolve architectural design problems is often not readily available during the early conceptual stages, requiring proposals to be evaluated only after an initial solution is reached. This "solution-driven" design approach focuses on the generation of designs as a means to explore the solution space. Generative design can be achieved computationally through parametric and algorithmic processes. However, utilizing a large repertoire of organizational patterns and design precedent knowledge together with the precise criteria of spatial evaluation can present design challenges even to an experienced architect. In the implementation of a parametric design process lies an opportunity to supplement the designer's knowledge with computational decision support that provides real-time spatial feedback during conceptual design. This paper presents an approach based on a generative multiperformance framework, configured for generating and optimizing architectural designs based on a precedent design. The system is constructed using a parametric modeling environment enabling the capture of precedent designs, extraction of spatial analytics, and demonstration of how populations can be used to drive the generation and optimization of alternate spatial solutions. A pilot study implementing the complete workflow of the system is used to illustrate the benefits of coupling parametric modeling with structured precedent analysis and design generation.
38

Kartashov, Sergey, Yuri Kozhukhov, Vycheslav Ivanov, Aleksei Danilishin, Aleksey Yablokov, Aleksey Aksenov, Ivan Yanin, and Minh Hai Nguyen. "The Problem of Accounting for Heat Exchange between the Flow and the Flow Part Surfaces When Modeling a Viscous Flow in Low-Flow Stages of a Centrifugal Compressor." Applied Sciences 10, no. 24 (December 21, 2020): 9138. http://dx.doi.org/10.3390/app10249138.

Abstract:
In this paper, we review the problem of accounting for heat exchange between the flow and the flow part surfaces when creating a calculation model for modeling the workflow process of low-flow stages of a centrifugal compressor using computational fluid dynamics (CFD). The objective selected for this study was a low-flow intermediate-type stage with the conditional flow coefficient Φ = 0.008 and the relative width at the impeller exit b2/D2 = 0.0133. We show that, in the case of modeling with the widespread adiabatic wall simplification, the calculated temperature in the gaps between the impeller and the stator elements is significantly overestimated. Modeling of the working process in the flow part was carried out with a coupled heat exchanger, as well as with simplified accounting for heat transfer by setting the temperatures of the walls. The gas-dynamic characteristics of the stage were compared with the experimental data, the heat transfer influence on the disks friction coefficient was estimated, and the temperature distributions in the gaps between disks and in the flow part of the stage were analyzed. It is shown that the main principle when modeling the flow in a low-flow stage is to ensure a correct temperature distribution in the gaps.
39

Hoover, Alexander P., Joost Daniels, Janna C. Nawroth, and Kakani Katija. "A Computational Model for Tail Undulation and Fluid Transport in the Giant Larvacean." Fluids 6, no. 2 (February 20, 2021): 88. http://dx.doi.org/10.3390/fluids6020088.

Abstract:
Flexible propulsors are ubiquitous in aquatic and flying organisms and are of great interest for bioinspired engineering. However, many animal models, especially those found in the deep sea, remain inaccessible to direct observation in the laboratory. We address this challenge by conducting an integrative study of the giant larvacean, an invertebrate swimmer and “fluid pump” of the mesopelagic zone. We demonstrate a workflow involving deep sea robots, advanced imaging tools, and numerical modeling to assess the kinematics and resulting fluid transport of the larvacean’s beating tail. A computational model of the tail was developed to simulate the local fluid environment and the tail kinematics using embedded passive (elastic) and active (muscular) material properties. The model examines how varying the extent of muscular activation affects the resulting kinematics and fluid transport rates. We find that muscle activation in two-thirds of the tail’s length, which corresponds to the observed kinematics in giant larvaceans, generates a greater average downstream flow speed than other designs with the same power input. Our results suggest that the active and passive material properties of the larvacean tail are tuned to produce efficient fluid transport for swimming and feeding, as well as provide new insight into the role of flexibility in biological propulsors.
40

Rivera, Diego A., Renaud A. Danhaive, and Caitlin T. Mueller. "Adaptive Framework for Structural Pattern Optimization." Journal of the International Association for Shell and Spatial Structures 61, no. 4 (December 1, 2020): 296–306. http://dx.doi.org/10.20898/j.iass.2020.012.

Abstract:
This research outlines a new computational workflow for the design and optimization of patterned, perforated surface structures. Well-designed surface structures can be highly efficient on their own, but their potential for structural efficiency can be notably improved by deliberately introducing specific aperture patterns. Patterned surfaces can also be used to produce more stimulating architectural environments, and even those of increasing complexity can be realized thanks to recent developments in digital fabrication. With this said, designers currently lack a streamlined and rigorous approach for the exploration of patterned surface structures. This research aims to address this issue by advancing a recent work that employs NURBS-based isogeometric analysis to integrate structural analysis into an accessible CAD modeling platform. Specifically, this paper proposes an adaptive pattern optimization framework formulated to save designers appreciable computational time. Not only does this framework offer a way to quickly visualize various design solutions, but it also provides the designer an important opportunity for design interaction and control over design evolution, in turn lending itself as a versatile tool for the exploration and conceptual design of patterned surface structures.
41

Suplatov, Dmitry, Nina Popova, Sergey Zhumatiy, Vladimir Voevodin, and Vytas Švedas. "Parallel workflow manager for non-parallel bioinformatic applications to solve large-scale biological problems on a supercomputer." Journal of Bioinformatics and Computational Biology 14, no. 02 (April 2016): 1641008. http://dx.doi.org/10.1142/s0219720016410080.

Abstract:
Rapid expansion of online resources providing access to genomic, structural, and functional information associated with biological macromolecules opens an opportunity to gain a deeper understanding of the mechanisms of biological processes through systematic analysis of large datasets. This, however, requires novel strategies to optimally utilize computer processing power. Some methods in bioinformatics and molecular modeling require extensive computational resources. Other algorithms have fast implementations which take at most several hours to analyze a common input on a modern desktop station; however, due to multiple invocations for a large number of subtasks, the full task requires significant computing power. Therefore, an efficient computational solution to large-scale biological problems requires both a well-designed parallel implementation of resource-hungry methods and a smart workflow to manage multiple invocations of relatively fast algorithms. In this work, a new software tool, mpiWrapper, has been developed to accommodate non-parallel implementations of scientific algorithms within the parallel supercomputing environment. The Message Passing Interface has been implemented to exchange information between nodes. Two specialized threads, one for task management and communication and another for subtask execution, are invoked on each processing unit to avoid deadlock while using blocking calls to MPI. The mpiWrapper can be used to launch all conventional Linux applications without the need to modify their original source code and supports resubmission of subtasks on node failure. We show that this approach can be used to process huge amounts of biological data efficiently by running non-parallel programs in parallel mode on a supercomputer. The C++ source code and documentation are available from http://biokinet.belozersky.msu.ru/mpiWrapper.
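The two-role design described above (a task-management side and a subtask-execution side communicating through blocking MPI calls) follows the classic master-worker pattern. A simplified illustration with mpi4py is given below; the task list and commands are hypothetical, and this is not the mpiWrapper source code.

```python
# Simplified master-worker task farm in the spirit of the description above
# (illustrative only; run with e.g. `mpirun -n 4 python taskfarm.py`).
import subprocess
from mpi4py import MPI

comm = MPI.COMM_WORLD
rank, size = comm.Get_rank(), comm.Get_size()
TAG_TASK, TAG_STOP = 1, 2

if rank == 0:
    tasks = [f"echo processing sample_{i}" for i in range(20)]   # hypothetical subtasks
    status, stopped = MPI.Status(), 0
    while stopped < size - 1:
        comm.recv(source=MPI.ANY_SOURCE, status=status)          # a worker asks for work
        if tasks:
            comm.send(tasks.pop(), dest=status.Get_source(), tag=TAG_TASK)
        else:
            comm.send(None, dest=status.Get_source(), tag=TAG_STOP)
            stopped += 1
else:
    status = MPI.Status()
    while True:
        comm.send("ready", dest=0)                               # request the next subtask
        cmd = comm.recv(source=0, status=status)
        if status.Get_tag() == TAG_STOP:
            break
        subprocess.run(cmd, shell=True, check=False)             # launch an ordinary Linux program
```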
42

Zhao, Hu, and Julia Kowalski. "Topographic uncertainty quantification for flow-like landslide models via stochastic simulations." Natural Hazards and Earth System Sciences 20, no. 5 (May 26, 2020): 1441–61. http://dx.doi.org/10.5194/nhess-20-1441-2020.

Abstract:
Abstract. Digital elevation models (DEMs) representing topography are an essential input for computational models capable of simulating the run-out of flow-like landslides. Yet, DEMs are often subject to error, a fact that is mostly overlooked in landslide modeling. We address this research gap and investigate the impact of topographic uncertainty on landslide run-out models. In particular, we will describe two different approaches to account for DEM uncertainty, namely unconditional and conditional stochastic simulation methods. We investigate and discuss their feasibility, as well as whether DEM uncertainty represented by stochastic simulations critically affects landslide run-out simulations. Based upon a historic flow-like landslide event in Hong Kong, we present a series of computational scenarios to compare both methods using our modular Python-based workflow. Our results show that DEM uncertainty can significantly affect simulation-based landslide run-out analyses, depending on how well the underlying flow path is captured by the DEM, as well as on further topographic characteristics and the DEM error's variability. We further find that, in the absence of systematic bias in the DEM, a performant root-mean-square-error-based unconditional stochastic simulation yields similar results to a computationally intensive conditional stochastic simulation that takes actual DEM error values at reference locations into account. In all other cases the unconditional stochastic simulation overestimates the variability in the DEM error, which leads to an increase in the potential hazard area as well as extreme values of dynamic flow properties.
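The unconditional approach can be illustrated by drawing spatially correlated error fields scaled to a target root-mean-square error, adding them to the DEM, and rerunning the run-out model on each perturbed topography. The sketch below uses a synthetic DEM and a placeholder run-out function; it is not the authors' Python workflow.

```python
# Illustrative unconditional stochastic simulation of DEM error (synthetic DEM and a
# placeholder run-out model; parameter values are assumptions).
import numpy as np
from scipy.ndimage import gaussian_filter

rng = np.random.default_rng(42)
dem = np.zeros((200, 200))                     # stand-in for a real elevation grid (m)
target_rmse = 1.5                              # assumed DEM error level (m)

def runout_area(perturbed_dem):
    """Placeholder for a flow-like landslide run-out simulation."""
    return float((perturbed_dem < perturbed_dem.mean()).sum())

areas = []
for _ in range(100):                           # Monte Carlo realizations
    noise = gaussian_filter(rng.standard_normal(dem.shape), sigma=5)   # spatial correlation
    noise *= target_rmse / np.sqrt(np.mean(noise**2))                  # rescale to target RMSE
    areas.append(runout_area(dem + noise))

print(np.mean(areas), np.std(areas))           # spread of the hazard measure due to DEM error
```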
43

Bernardet, Ligia, Laurie Carson, and Vijay Tallapragada. "The Design of a Modern Information Technology Infrastructure to Facilitate Research-to-Operations Transition for NCEP’s Modeling Suites." Bulletin of the American Meteorological Society 98, no. 5 (May 1, 2017): 899–904. http://dx.doi.org/10.1175/bams-d-15-00139.1.

Abstract:
NOAA/NCEP runs a number of numerical weather prediction (NWP) modeling suites to provide operational guidance to the National Weather Service field offices and service centers. A sophisticated infrastructure, which includes a complex set of software tools, is required to facilitate running these NWP suites. This infrastructure needs to be maintained and upgraded so that continued improvements in forecast accuracy can be achieved. This contribution describes the design of a robust NWP Information Technology Environment (NITE) to support and accelerate the transition of innovations to NOAA operational modeling suites. Through consultation with and at the request of the NOAA NCEP Environmental Modeling Center, a survey of segments of the national NWP community and a review of selected aspects of the computational infrastructure of several modeling centers were conducted, which led to the following elements being considered key for NITE: data management, source code management and build systems, suite definition tools, scripts, workflow management, experiment database, and documentation and training. The design for NITE put forth by the DTC would make model development by NOAA staff and their external collaborators more effective and efficient. It should be noted that NITE was not designed to work exclusively for a certain modeling suite; instead, it transcends the current operational suites and is applicable to the expected evolution in NCEP systems. NITE is particularly important for community engagement in the Next-Generation Global Prediction System, which is expected to be an Earth modeling system including several components.
44

Alkadri, Miktha Farid, Francesco De Luca, Michela Turrin, and Sevil Sariyildiz. "A Computational Workflow for Generating A Voxel-Based Design Approach Based on Subtractive Shading Envelopes and Attribute Information of Point Cloud Data." Remote Sensing 12, no. 16 (August 9, 2020): 2561. http://dx.doi.org/10.3390/rs12162561.

Abstract:
This study proposes a voxel-based design approach based on the subtractive mechanism of shading envelopes and attribute information of point cloud data in tropical climates. In particular, the proposed method evaluates a volumetric sample of new buildings based on predefined shading performance criteria. With the support of geometric and radiometric information stored in the point cloud, such as position (XYZ), color (RGB), and reflection intensity (I), an integrated computational workflow between passive design strategy and 3D scanning technology is developed. It aims not only to compensate for some pertinent aspects of current 3D site modeling, such as vegetation and surrounding buildings, but also to investigate surface characteristics of existing contexts, such as visible sun vectors and material properties. These aspects are relevant for conducting a comprehensive environmental simulation while averting negative microclimatic impacts when placing the new building in the existing context. Ultimately, this study may support architects in decision-making during the conceptual design stage based on real contextual conditions.
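The subtractive mechanism can be pictured as an occlusion test: a candidate voxel is rejected if it blocks any sun vector that must remain unobstructed for a protected point on a neighbouring facade. The helper below is a hypothetical, simplified bounding-sphere ray test, not the authors' point-cloud-based tool.

```python
# Hypothetical occlusion test for the subtractive shading-envelope idea described above.
import numpy as np

def blocks_required_sun(voxel_center, protected_point, sun_dirs, voxel_half=0.5):
    """True if the voxel's bounding sphere intersects any ray from the protected
    point toward one of the required sun directions."""
    rel = np.asarray(voxel_center, dtype=float) - np.asarray(protected_point, dtype=float)
    radius = voxel_half * np.sqrt(3.0)                 # bounding-sphere radius of the voxel
    for d in np.asarray(sun_dirs, dtype=float):
        d = d / np.linalg.norm(d)
        t = rel @ d                                    # distance along the ray to closest approach
        if t > 0 and np.linalg.norm(rel - t * d) <= radius:
            return True
    return False

# Example: a voxel sitting on the path of a low sun vector would be subtracted.
print(blocks_required_sun([0, -2, 2], [0, 0, 0], sun_dirs=[[0, -1, 1]]))
```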
45

Eschle, Jonas, Albert Navarro Puig, Rafael Silva Coutinho, and Nicola Serra. "zfit: scalable pythonic fitting." EPJ Web of Conferences 245 (2020): 06025. http://dx.doi.org/10.1051/epjconf/202024506025.

Abstract:
Statistical modeling and fitting is a key element in most HEP analyses. This task is usually performed in the C++-based framework ROOT/RooFit. Recently, the HEP community has been shifting toward the Python language, into which the tools above are only loosely integrated, and the lack of stable, native Python-based toolkits has become clear. We present zfit, a project that aims at building a fitting ecosystem by providing a carefully designed, stable API and a workflow for libraries to communicate, together with an implementation fully integrated into the Python ecosystem. It is built on top of one of the state-of-the-art industry tools, TensorFlow, which is used as the main computational backend. zfit provides data loading, extensive model building capabilities, loss creation, minimization, and certain error estimation. Each part is also provided with convenient base classes built for customizability and extendability.
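A minimal fit following the workflow named in the abstract (model building, data loading, loss creation, minimization, error estimation) is sketched below; the observable, parameter values, and data are illustrative, and API details may differ between zfit versions.

```python
# Minimal unbinned Gaussian fit sketch (illustrative values; zfit API details may vary
# across releases).
import numpy as np
import zfit

obs = zfit.Space("x", limits=(-5, 5))                     # observable and fit range
mu = zfit.Parameter("mu", 0.2, -1.0, 1.0)
sigma = zfit.Parameter("sigma", 1.1, 0.1, 5.0)
gauss = zfit.pdf.Gauss(mu=mu, sigma=sigma, obs=obs)       # model building

data = zfit.Data.from_numpy(obs=obs, array=np.random.normal(0.0, 1.0, size=10_000))

nll = zfit.loss.UnbinnedNLL(model=gauss, data=data)       # loss creation
result = zfit.minimize.Minuit().minimize(nll)             # minimization
result.hesse()                                            # error estimation
print(result.params)
```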
46

Musselman, Eric D., Jake E. Cariello, Warren M. Grill, and Nicole A. Pelot. "ASCENT (Automated Simulations to Characterize Electrical Nerve Thresholds): A pipeline for sample-specific computational modeling of electrical stimulation of peripheral nerves." PLOS Computational Biology 17, no. 9 (September 7, 2021): e1009285. http://dx.doi.org/10.1371/journal.pcbi.1009285.

Abstract:
Electrical stimulation and block of peripheral nerves hold great promise for treatment of a range of disease and disorders, but promising results from preclinical studies often fail to translate to successful clinical therapies. Differences in neural anatomy across species require different electrodes and stimulation parameters to achieve equivalent nerve responses, and accounting for the consequences of these factors is difficult. We describe the implementation, validation, and application of a standardized, modular, and scalable computational modeling pipeline for biophysical simulations of electrical activation and block of nerve fibers within peripheral nerves. The ASCENT (Automated Simulations to Characterize Electrical Nerve Thresholds) pipeline provides a suite of built-in capabilities for user control over the entire workflow, including libraries for parts to assemble electrodes, electrical properties of biological materials, previously published fiber models, and common stimulation waveforms. We validated the accuracy of ASCENT calculations, verified usability in beta release, and provide several compelling examples of ASCENT-implemented models. ASCENT will enable the reproducibility of simulation data, and it will be used as a component of integrated simulations with other models (e.g., organ system models), to interpret experimental results, and to design experimental and clinical interventions for the advancement of peripheral nerve stimulation therapies.
47

Xue, Xu, Changdong Yang, Tsubasa Onishi, Michael J. King, and Akhil Datta–Gupta. "Modeling Hydraulically Fractured Shale Wells Using the Fast-Marching Method With Local Grid Refinements and an Embedded Discrete Fracture Model." SPE Journal 24, no. 06 (July 29, 2019): 2590–608. http://dx.doi.org/10.2118/193822-pa.

Abstract:
Recently, fast-marching-method (FMM)-based flow simulation has shown great promise for rapid modeling of unconventional oil and gas reservoirs. Currently, the application of FMM-based simulation has been limited to using tartan grids to model the hydraulic fractures (HFs). The use of tartan grids adversely impacts the computational efficiency, particularly for field-scale applications with hundreds of HFs. Our purpose in this paper is to extend FMM-based simulation to incorporate local grid refinements (LGRs) and an embedded discrete fracture model (EDFM) to simulate HFs with natural fractures, and to validate the accuracy and efficiency of the methodologies. The FMM-based simulation is extended to LGRs and EDFM. This requires novel gridding through the introduction of triangles (2D) and tetrahedrons (2.5D) to link the local and global domain and solution of the Eikonal equation in unstructured grids to compute the diffusive time of flight (DTOF). The FMM-based flow simulation reduces a 3D simulation to an equivalent 1D simulation using the DTOF as a spatial coordinate. The 1D simulation can be carried out using a standard finite-difference method, thus leading to orders of magnitude of savings in computation time compared with full 3D simulation for high-resolution models. First, we validate the accuracy and computational efficiency of the FMM-based simulation with LGRs by comparing them with tartan grids. The results show good agreement and the FMM-based simulation with LGRs shows significant improvement in computational efficiency. Then, we apply the FMM-based simulation with LGRs to the case of a multistage-hydraulic-fractured horizontal well with multiphase flow, to demonstrate the practical feasibility of our proposed approach. After that, we investigate various discretization schemes for the transition between the local and global domain in the FMM-based flow simulation. The results are used to identify optimal gridding schemes to maintain accuracy while improving computational efficiency. Finally, we demonstrate the workflow of the FMM-based simulation with EDFM, including grid generation; comparison with FMM with unstructured grid; and validation of the results. The FMM with EDFM can simulate arbitrary fracture patterns without simplification and shows good accuracy and efficiency. This is the first study to apply the FMM-based flow simulation with LGRs and EDFM. The three main contributions of the proposed methodology are (i) unique mesh-generation schemes to link fracture and matrix flow domains, (ii) DTOF calculations in locally refined grids, and (iii) sensitivity studies to identify optimal discretization schemes for the FMM-based simulation.
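The diffusive time of flight is obtained by solving an Eikonal equation, which a fast-marching solver handles directly. A generic sketch with the scikit-fmm package on a synthetic 2D grid is shown below; the speed field, grid, and well location are hypothetical, and this is not the authors' simulator.

```python
# Generic Eikonal solve with the fast-marching method on a synthetic 2D grid
# (hypothetical speed field standing in for a diffusivity-derived property).
import numpy as np
import skfmm

nx, ny, dx = 201, 201, 10.0                      # grid dimensions and spacing (m)
speed = np.full((nx, ny), 1.0)                   # propagation speed field
speed[80:120, 80:120] = 0.2                      # a slow (low-diffusivity) region

phi = np.ones((nx, ny))
phi[100, 100] = -1.0                             # zero level set marks the assumed well cell

dtof = skfmm.travel_time(phi, speed, dx=dx)      # DTOF-like coordinate over the grid
print(float(dtof.min()), float(dtof.max()))
```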
48

Kozhukhov, Yuri, Aleksey Danilishin, Sergey Kartashov, Lyubov Gileva, Aleksey Yablokov, Vyacheslav Ivanov, and Serafima Tatchenkova. "Application of Super-Computer Technologies in the Research and Improvement of Steps and Elements of the Flow Path of the Turbocompressors." E3S Web of Conferences 220 (2020): 01078. http://dx.doi.org/10.1051/e3sconf/202022001078.

Abstract:
This paper presents the results of a study of the spatial flow in turbocompressor elements using computational fluid dynamics methods with the Ansys CFX software package on a multiprocessor computer system. Five objects of research are considered: 1) the flow path of an intermediate stage of an average-flow centrifugal compressor; 2) the flow path of a low-flow centrifugal compressor stage; 3) a natural gas centrifugal compressor stage; 4) the vaned diffuser of the first stage of an industrial multistage centrifugal compressor; 5) the adjustable inlet stator of the first stage of an industrial turbocompressor. Generally, when manufacturing new centrifugal compressors, it is impossible to take control measurements of the working-process parameters inside the flow path elements. Computational fluid dynamics methods are widely used to overcome these difficulties. However, verification and validation of CFD methods are necessary for accurate modeling of the workflow. All calculations were performed on one of the SPbPU clusters. Parameters of one cluster: AMD Opteron 280, 4 cores. The calculations were carried out with the processors running in parallel: HP MPI Distributed Parallel and HP MPI Local Parallel for the different objects.
49

Gogovi, Gideon K., Fahad Almsned, Nicole Bracci, Kylene Kehn-Hall, Amarda Shehu, and Estela Blaisten-Barojas. "Modeling the Tertiary Structure of the Rift Valley Fever Virus L Protein." Molecules 24, no. 9 (May 7, 2019): 1768. http://dx.doi.org/10.3390/molecules24091768.

Abstract:
A tertiary structure governs, to a great extent, the biological activity of a protein in the living cell and is consequently a central focus of numerous studies aiming to shed light on cellular processes central to human health. Here, we aim to elucidate the structure of the Rift Valley fever virus (RVFV) L protein using a combination of in silico techniques. Due to its large size and multiple domains, elucidation of the tertiary structure of the L protein has so far challenged both dry and wet laboratories. In this work, we leverage complementary perspectives and tools from the computational-molecular-biology and bioinformatics domains for constructing, refining, and evaluating several atomistic structural models of the L protein that are physically realistic. All computed models have very flexible termini of about 200 amino acids each and a high proportion of helical regions. Properties such as potential energy, radius of gyration, hydrodynamic radius, flexibility coefficient, and solvent-accessible surface are reported. Structural characterization of the L protein enables our laboratories to better understand viral replication and transcription via further studies of L protein-mediated protein-protein interactions. While the results presented focus on the RVFV L protein, the workflow constitutes a more general modeling protocol for determining the tertiary structure of multidomain proteins consisting of thousands of amino acids.
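As one concrete example of the reported properties, the radius of gyration can be computed directly from atomic coordinates and masses; the coordinates below are random placeholders, whereas in practice they would be read from the model's PDB file.

```python
# Mass-weighted radius of gyration from atomic coordinates (placeholder random coordinates).
import numpy as np

rng = np.random.default_rng(1)
coords = rng.normal(scale=20.0, size=(2000, 3))    # atom positions (Angstrom), hypothetical
masses = np.full(2000, 12.0)                       # atom masses (all carbon-like here)

center = np.average(coords, axis=0, weights=masses)
rg = np.sqrt(np.average(np.sum((coords - center) ** 2, axis=1), weights=masses))
print(f"Radius of gyration: {rg:.1f} A")
```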
50

Argentati, Chiara, Ilaria Tortorella, Martina Bazzucchi, Francesco Morena, and Sabata Martino. "Harnessing the Potential of Stem Cells for Disease Modeling: Progress and Promises." Journal of Personalized Medicine 10, no. 1 (February 6, 2020): 8. http://dx.doi.org/10.3390/jpm10010008.

Abstract:
Ex vivo cell/tissue-based models are an essential step in the workflow of pathophysiology studies, assay development, disease modeling, drug discovery, and development of personalized therapeutic strategies. For these purposes, both scientific and pharmaceutical research have adopted ex vivo stem cell models because of their better predictive power. In fact, advances in isolation and in vitro expansion protocols for culturing autologous human stem cells, together with the standardization of methods for generating patient-derived induced pluripotent stem cells, have made it feasible to generate and investigate human cellular disease models with even greater speed and efficiency. Furthermore, the potential of stem cells for generating more complex systems, such as scaffold-cell models, organoids, or organs-on-a-chip, has made it possible to overcome the limitations of two-dimensional culture systems and to better mimic tissue structures and functions. Finally, the advent of genome-editing/gene-therapy technologies has had a great impact on the generation of more proficient stem cell disease models and on establishing effective therapeutic treatments. In this review, we discuss important breakthroughs in stem cell-based models, highlighting current directions, advantages, and limitations, and point out the need to combine experimental biology with computational tools able to describe complex biological systems and deliver results or predictions in the context of personalized medicine.
