Academic literature on the topic 'Computer process variables'

Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles

Consult the lists of relevant articles, books, theses, conference reports, and other scholarly sources on the topic 'Computer process variables.'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate a bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever these are available in the metadata.

Journal articles on the topic "Computer process variables"

1

Turng, L.-S., and M. Peić. "Computer aided process and design optimization for injection moulding." Proceedings of the Institution of Mechanical Engineers, Part B: Journal of Engineering Manufacture 216, no. 12 (December 1, 2002): 1523–32. http://dx.doi.org/10.1243/095440502321016288.

Full text
Abstract:
Sophisticated computer aided engineering (CAE) simulation tools for injection moulding have been available and are now widely used in industrial practices. As a result, the design and manufacturing of injection-moulded parts have been literally transformed from a ‘black art’ to an engineering discipline based on scientific principles. It is well recognized that computer simulation tools help engineers to gain process insight and to pinpoint blind spots and problems that are overlooked. Nevertheless, there remains a missing link in CAE, which lies in the ability to identify effectively the optimal design and process variables, as it is hampered by the sheer amount of computer-generated data and complex non-linear interactions among those input variables. This paper presents the system implementation and experimental verifications of an integrated CAE optimization tool that couples a process simulation program with optimization algorithms to determine intelligently and automatically the optimal design and process variables for injection moulding. In addition, this study enables evaluation and comparison of various local and global optimization algorithms in terms of computational efficiency and effectiveness for injection moulding, as presented in this paper.
APA, Harvard, Vancouver, ISO, and other styles
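The optimization loop this abstract describes, coupling a process simulator with an optimization algorithm to pick design and process variables, can be sketched in miniature. Everything below is invented for illustration: `mock_moulding_simulation` is a quadratic stand-in for a CAE solver, and exhaustive grid search stands in for the local and global optimizers the paper actually compares.

```python
def mock_moulding_simulation(melt_temp, inj_time):
    # Stand-in for an expensive CAE run: returns a scalar defect
    # measure for a melt temperature (C) and injection time (s).
    # The quadratic form is illustrative only, not a physical model.
    return (melt_temp - 230.0) ** 2 / 100.0 + 5.0 * (inj_time - 1.2) ** 2

# Evaluate the "simulator" over a grid of candidate process variables
# and keep the setting with the lowest predicted defect measure.
best = min(
    (mock_moulding_simulation(t, s), t, s)
    for t in range(180, 281, 5)
    for s in (0.6, 0.8, 1.0, 1.2, 1.4)
)
print(best)  # → (0.0, 230, 1.2)
```

A real CAE coupling would replace the grid with a smarter search, since each simulator call is costly.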
2

Shamasundar, S., A. G. Marathe, and S. K. Biswas. "Effect of Process Variables on Die-Billet Temperature History in a Slow Speed Hot Coining Type Process." Journal of Engineering for Industry 113, no. 4 (November 1, 1991): 362–72. http://dx.doi.org/10.1115/1.2899709.

Full text
Abstract:
A computer code is developed as a part of an ongoing project on computer aided process modelling of forging operation, to simulate heat transfer in a die-billet system. The code developed on a stage-by-stage technique is based on an Alternating Direction Implicit scheme. The experimentally validated code is used to study the effect of process specifics such as preheat die temperature, machine ascent time, rate of deformation, and dwell time on the thermal characteristics in a batch coining operation where deformation is restricted to surface level only.
APA, Harvard, Vancouver, ISO, and other styles
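The Alternating Direction Implicit scheme mentioned in the abstract reduces a multi-dimensional heat-conduction step to a sequence of 1-D implicit sweeps, each of which is a tridiagonal solve. Below is a minimal sketch of one such sweep (backward Euler with fixed-temperature die faces); the grid, temperatures, and boundary treatment are invented for illustration, not taken from the paper.

```python
def thomas_solve(a, b, c, d):
    # Solve a tridiagonal system: a = sub-, b = main, c = super-diagonal.
    n = len(d)
    cp, dp = [0.0] * n, [0.0] * n
    cp[0], dp[0] = c[0] / b[0], d[0] / b[0]
    for i in range(1, n):
        m = b[i] - a[i] * cp[i - 1]
        cp[i] = c[i] / m if i < n - 1 else 0.0
        dp[i] = (d[i] - a[i] * dp[i - 1]) / m
    x = [0.0] * n
    x[-1] = dp[-1]
    for i in range(n - 2, -1, -1):
        x[i] = dp[i] - cp[i] * x[i + 1]
    return x

def implicit_heat_sweep(T, r):
    # One backward-Euler step of 1-D conduction, r = alpha*dt/dx^2.
    # End nodes are held at the die temperature (Dirichlet boundaries).
    n = len(T)
    a, b, c = [-r] * n, [1.0 + 2.0 * r] * n, [-r] * n
    a[0] = c[0] = a[-1] = c[-1] = 0.0
    b[0] = b[-1] = 1.0
    return thomas_solve(a, b, c, list(T))

# Hot die faces (400 C) against a cooler billet interior (20 C).
T = [400.0] + [20.0] * 9 + [400.0]
T1 = implicit_heat_sweep(T, 0.5)
```

After one step the interior nodes nearest the die warm up while the boundary values stay fixed, which is the qualitative behaviour the paper's die-billet study tracks over many steps.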
3

Offermans, Tim, Ewa Szymańska, Lutgarde M. C. Buydens, and Jeroen J. Jansen. "Synchronizing process variables in time for industrial process monitoring and control." Computers & Chemical Engineering 140 (September 2020): 106938. http://dx.doi.org/10.1016/j.compchemeng.2020.106938.

Full text
APA, Harvard, Vancouver, ISO, and other styles
4

Williams-Green, Joyce, Glen Holmes, and Thomas M. Sherman. "Culture as a Decision Variable for Designing Computer Software." Journal of Educational Technology Systems 26, no. 1 (September 1997): 3–18. http://dx.doi.org/10.2190/ljuq-19h1-ulkc-dt1h.

Full text
Abstract:
Computer software design generally follows a systematic process that addresses decisions on variables known to influence learning success. We propose that culture should be included as part of this decision process. Culture represents the complex of social, emotional, intellectual, physical, and personal factors that individuals use to create meaning. When these cognitive anchors are missing from instructional materials, achievement may be imperiled. Examples of how cultural variables can be incorporated into instructional decisions are presented to illustrate the potential for enhancing software.
APA, Harvard, Vancouver, ISO, and other styles
5

Slišković, Dražen, Ratko Grbić, and Željko Hocenski. "Adaptive Estimation of Difficult-to-Measure Process Variables." Automatika 54, no. 2 (January 2013): 166–77. http://dx.doi.org/10.7305/automatika.54-2.147.

Full text
APA, Harvard, Vancouver, ISO, and other styles
6

Watson, J. Allen, Michelle I. Eichhorn, and John Scanzoni. "A Home/University Computer Network: Test of a System to Study Families." Journal of Educational Technology Systems 17, no. 4 (June 1989): 319–35. http://dx.doi.org/10.2190/j7uk-5bax-ccb0-p9yf.

Full text
Abstract:
The primary purpose of this study was to produce a new computer-based research paradigm designed to test family process variables. Twenty-nine males representing twenty-nine homes, each with a microcomputer and modem, served as subjects across a two-month period. A microcomputer/mainframe system was developed and integrated with a conceptual model used to test family decision-making variables. Nine subtests used in the conceptual model served as process variables in this study. Attitude questions concerning gender role preferences, religious commitment, empathy toward spouse, marital commitment, perception of spousal conflicts, degree of individualism, and self-esteem were presented and recorded via the university mainframe from home computers. Data were analyzed across two test battery replications (two months). The data showed that the integration of an existing family process conceptual model and the microcomputer/mainframe system could be used as a new research paradigm, that the two months of testing provided strong support for paradigm efficiency, and that the paradigm proved to be highly reliable and valid.
APA, Harvard, Vancouver, ISO, and other styles
7

Fonseca, Ijar M., and Peter M. Bainum. "Integrated Structural and Control Optimization." Journal of Vibration and Control 10, no. 10 (October 2004): 1377–91. http://dx.doi.org/10.1177/1077546304042043.

Full text
Abstract:
This paper focuses on the integrated structural/control optimization of a large space structure with a robot arm subject to the gravity-gradient torque through a semi-analytical approach. It is well known that the computer effort to compute numerically derivatives of the constraints with respect to design variables makes the process expensive and time-consuming. In this sense, a semi-analytical approach may represent a good alternative when optimizing systems that require sensitivity calculations with respect to design parameters. In this study, constraints from the structure and control disciplines are imposed on the optimization process with the aim of obtaining the structure’s minimum weight and the optimum control performance. In the process optimization, the sensitivity of the constraints is computed by a semi-analytical approach. This approach combines the use of analytical derivatives of the mass and stiffness matrices with the numerical solution of the eigenvalue problem to obtain the eigenvalue derivative with respect to the design variables. The analytical derivatives are easy to obtain since our space structure is a long one-dimensional beam-like spacecraft.
APA, Harvard, Vancouver, ISO, and other styles
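The semi-analytical recipe in this abstract (analytical derivatives of the mass and stiffness matrices combined with a numerical eigensolution) can be illustrated on a toy two-degree-of-freedom system. All matrices below are invented; the sketch relies on the standard identity that, for a mass-normalized mode phi, the eigenvalue sensitivity is phi' (dK/dp - lambda dM/dp) phi.

```python
import numpy as np

def K(p):
    # Stiffness matrix, analytic in the design variable p (toy example).
    return np.array([[p + 4.0, -4.0], [-4.0, 6.0]])

M = np.diag([2.0, 1.0])                      # mass matrix (independent of p)
dK_dp = np.array([[1.0, 0.0], [0.0, 0.0]])   # analytical derivative dK/dp

def modes(p):
    # Numerical solution of K phi = lambda M phi via the symmetric
    # transform A = M^(-1/2) K M^(-1/2); phi comes out mass-normalized.
    Mih = np.diag(1.0 / np.sqrt(np.diag(M)))
    lam, v = np.linalg.eigh(Mih @ K(p) @ Mih)
    return lam, Mih @ v

p0, h = 3.0, 1e-6
lam, phi = modes(p0)

# Semi-analytical sensitivity of the first eigenvalue (dM/dp = 0 here).
d_lam = phi[:, 0] @ dK_dp @ phi[:, 0]

# Finite-difference check: the expensive alternative the paper avoids.
fd = (modes(p0 + h)[0][0] - lam[0]) / h
print(abs(d_lam - fd) < 1e-4)  # → True
```

The point of the semi-analytical route is that only the cheap matrix derivatives are analytic; the eigensolution itself stays numerical, so no repeated full solves are needed per design variable.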
8

Hofmann, N., S. Olive, G. Laschet, F. Hediger, J. Wolf, and P. R. Sahm. "Numerical optimization of process control variables for the Bridgman casting process." Modelling and Simulation in Materials Science and Engineering 5, no. 1 (January 1, 1997): 23–34. http://dx.doi.org/10.1088/0965-0393/5/1/002.

Full text
APA, Harvard, Vancouver, ISO, and other styles
9

Thoreson, Curtis, Keith Webster, Matthew Darr, and Emily Kapler. "Investigation of Process Variables in the Densification of Corn Stover Briquettes." Energies 7, no. 6 (June 24, 2014): 4019–32. http://dx.doi.org/10.3390/en7064019.

Full text
APA, Harvard, Vancouver, ISO, and other styles
10

Boschetti, F., F. M. Montevecchi, and R. Fumero. "Virtual Extracorporeal Circulation Process." International Journal of Artificial Organs 20, no. 6 (June 1997): 341–51. http://dx.doi.org/10.1177/039139889702000608.

Full text
Abstract:
Virtual instruments for an extracorporeal circulation (ECC) process were developed to simulate the reactions of a patient to different artificial perfusion conditions. The computer simulation of the patient takes into account the hydraulic, volume, thermal and biochemical phenomena and their interaction with the devices involved in ECC (cannulae dimensions, oxygenator and filter types, pulsatile or continuous pump and thermal exchangers). On the basis of the patient's initialisation data (height, weight, Ht) and perfusion variables (pump flow rate, water temperature, gas flow rate and composition) imposed by the operator, the virtual ECC monitors simulated arterial and venous pressure tracings in real time, along with arterial and venous flow rate tracings, urine production tracing and temperature levels. Oxyhemoglobin arterial and venous blood saturation together with other related variables (pO2, pCO2, pH, HCO3) are also monitored. A drug model which allows the simulation of the effect of vasodilator and diuretic drugs is also implemented. Alarms are provided in order to check which variables (pressure, saturation, pH, urine flow) are out of the expected ranges during the ECC simulation. Consequently the possibility of modifying the control parameters of the virtual devices of the ECC in run-time mode offers an interaction mode between the operator and the virtual environment.
APA, Harvard, Vancouver, ISO, and other styles
More sources

Dissertations / Theses on the topic "Computer process variables"

1

Marque-Pucheu, Sophie. "Gaussian process regression of two nested computer codes." Thesis, Sorbonne Paris Cité, 2018. http://www.theses.fr/2018USPCC155/document.

Full text
Abstract:
This thesis deals with the Gaussian process surrogate modelling (emulation) of two coupled computer codes. 'Two coupled codes' here means a system of two chained codes: the output of the first code is one of the inputs of the second. Both codes are computationally expensive. In order to carry out a sensitivity analysis of the output of the coupled code, we seek to build a surrogate model of this output from a small number of observations. Three types of observations of the system exist: those of the chained code, those of the first code only, and those of the second code only. The surrogate model has to be accurate in the most likely regions of the input domain of the nested code. The surrogate models are constructed within the universal kriging framework, with a Bayesian approach. First, the case with no information about the intermediary variable (the output of the first code) is addressed. An innovative parametrization of the mean function of the Gaussian process modelling the nested code, based on the coupling of two polynomials, is proposed. Then, the case with intermediary observations is addressed. A stochastic predictor based on the coupling of the predictors associated with the two codes is proposed, together with methods for quickly computing its mean and variance. Finally, the methods obtained for codes with scalar outputs are extended to codes with high-dimensional vectorial outputs. An efficient dimension-reduction method for the high-dimensional intermediary variable is proposed in order to facilitate the Gaussian process regression of the second code. All the proposed methods are applied to numerical examples.
APA, Harvard, Vancouver, ISO, and other styles
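The chained-code setting of this thesis can be sketched with a bare-bones Gaussian process emulator in which the predictive mean of the first emulator is simply plugged into the second. This plug-in-mean shortcut omits the uncertainty propagation that the thesis's stochastic predictor handles properly; the two toy 'codes' f1 and f2, the kernel length-scale, and the designs are all invented choices.

```python
import numpy as np

def rbf(a, b, ls=1.0):
    # Squared-exponential kernel between two 1-D point sets.
    return np.exp(-0.5 * (a[:, None] - b[None, :]) ** 2 / ls**2)

def gp_mean(x_tr, y_tr, x_te, jitter=1e-8):
    # Zero-mean GP regression: posterior mean at the test points.
    Kmat = rbf(x_tr, x_tr) + jitter * np.eye(len(x_tr))
    return rbf(x_te, x_tr) @ np.linalg.solve(Kmat, y_tr)

# Two chained "codes": the output of code 1 feeds code 2.
f1 = lambda x: np.sin(x)
f2 = lambda t: t ** 2

x1 = np.linspace(0.0, 3.0, 8)    # design and observations of code 1
t2 = np.linspace(-1.0, 1.0, 8)   # design and observations of code 2
x_new = np.array([1.3])

t_hat = gp_mean(x1, f1(x1), x_new)   # emulate code 1
y_hat = gp_mean(t2, f2(t2), t_hat)   # plug its mean into emulator of code 2

print(float(y_hat), float(np.sin(1.3) ** 2))  # the two values are close
```

With only eight runs of each "code", the chained emulator already tracks the true composition; the thesis's contribution is doing this rigorously, with the intermediary uncertainty carried through.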
2

Robinson, Ryan Patrick. "The Effect of Individual Differences on Training Process Variables in a Multistage Computer-Based Training Context." University of Akron / OhioLINK, 2009. http://rave.ohiolink.edu/etdc/view?acc_num=akron1238431328.

Full text
APA, Harvard, Vancouver, ISO, and other styles
3

Han, Gang. "Modeling the output from computer experiments having quantitative and qualitative input variables and its applications." Columbus, Ohio : Ohio State University, 2008. http://rave.ohiolink.edu/etdc/view?acc%5Fnum=osu1228326460.

Full text
APA, Harvard, Vancouver, ISO, and other styles
4

Katz, Ariel. "Improvement of chemical plant performance by analysing the main variables that affect the process while using statistic methods, neural networks and genetic programming." Thesis, University of Exeter, 1999. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.302659.

Full text
APA, Harvard, Vancouver, ISO, and other styles
5

Lam, Chen Quin. "Sequential Adaptive Designs in Computer Experiments for Response Surface Model Fit." The Ohio State University, 2008. http://rave.ohiolink.edu/etdc/view?acc_num=osu1211911211.

Full text
APA, Harvard, Vancouver, ISO, and other styles
6

Karipidou, Kelly. "Modelling the body language of a musical conductor using Gaussian Process Latent Variable Models." Thesis, KTH, Datorseende och robotik, CVAP, 2015. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-176101.

Full text
Abstract:
Motion capture data of a musical conductor's movements when conducting a string quartet is analysed in this work using the Gaussian Process Latent Variable Model (GP-LVM) framework. A dimensionality reduction on the high dimensional motion capture data to a two dimensional representation using a GP-LVM is performed, followed by classification of conduction movements belonging to different interpretations of the same musical piece. A dynamical prior is used for the GP-LVM, resulting in a representative latent space for the sequential conduction motion data. Classification results with great performance for some of the interpretations are obtained. The GP-LVM with dynamical prior distribution is shown to be a reasonable choice when wanting to model conduction data, opening up the possibility for creating for example a "conduct-your-own-orchestra" system in a principled mathematical way, in the future.
APA, Harvard, Vancouver, ISO, and other styles
7

Qian, Zhiguang. "Computer experiments: design, modeling and integration." Diss., Georgia Institute of Technology, 2006. http://hdl.handle.net/1853/11480.

Full text
Abstract:
The use of computer modeling is fast increasing in almost every scientific, engineering and business arena. This dissertation investigates some challenging issues in design, modeling and analysis of computer experiments, which will consist of four major parts. In the first part, a new approach is developed to combine data from approximate and detailed simulations to build a surrogate model based on some stochastic models. In the second part, we propose some Bayesian hierarchical Gaussian process models to integrate data from different types of experiments. The third part concerns the development of latent variable models for computer experiments with multivariate response with application to data center temperature modeling. The last chapter is devoted to the development of nested space-filling designs for multiple experiments with different levels of accuracy.
APA, Harvard, Vancouver, ISO, and other styles
8

Huang, B. M. "Computer model of the shaft kiln process at Queensland Magnesia (Operations) Pty. Ltd." Thesis, 1999. https://figshare.com/articles/thesis/Computer_model_of_the_shaft_kiln_process_at_Queensland_Magnesia_Operations_Pty_Ltd_/13459442.

Full text
APA, Harvard, Vancouver, ISO, and other styles
9

Zhang, Fan (張凡). "Bayesian Indicator Variable Selection in Gaussian Process for Computer Experiments." Thesis, 2018. http://ndltd.ncl.edu.tw/handle/h4ne4v.

Full text
Abstract:
Master's thesis, National Cheng Kung University, Department of Statistics, academic year 106 (2017/18).
In the past three decades, the analysis of computer experiments has received a lot of attention and plays an increasingly important role in solving various scientific and engineering problems. In this thesis, we are interested in variable selection problems for the Gaussian process model. For the computer experiments considered here, we not only focus on the mean function but also take the covariance structure into account. To accomplish our goal, indicators are added into the model to denote whether the variables are active or not. Two Bayesian variable selection algorithms are proposed. In addition to the simulation studies, several real examples are used to illustrate the performance of the proposed methods.
APA, Harvard, Vancouver, ISO, and other styles

Books on the topic "Computer process variables"

1

Sponton, Luca. From manufacturing variability to process-aware circuit simulation. Konstanz: Hartung-Gorre, 2009.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
2

Kogan, Efim, and Galina Zhukova. Theory of functions of a complex variable and operational calculus. ru: INFRA-M Academic Publishing LLC., 2020. http://dx.doi.org/10.12737/1058889.

Full text
Abstract:
The textbook contains the theoretical material of the lecture course, discusses in detail examples of typical tasks, and provides test tasks and tasks for independent work. Designed for students studying in the training areas 01.03.02 "Applied Mathematics and Informatics", 15.03.03 "Applied Mechanics", 10.05.03 "Information Security of Automated Systems", 09.03.01 "Computer Science", 15.03.01 "Mechanical Engineering", 15.03.04 "Automation of Technological Processes and Production", and 27.03.04 "Management in Technical Systems". Can also be used by teachers for conducting practical classes.
APA, Harvard, Vancouver, ISO, and other styles
3

Langevin, Christian D. MODFLOW-2000, the U.S. Geological Survey modular ground-water model--documentation of the SEAWAT-2000 version with the variable-density flow process (VDF) and the integrated MT3DMS transport process (IMT). Tallahassee, Fla. (2010 Levy Avenue, Tallahassee 32310): U.S. Dept. of the Interior, U.S. Geological Survey, 2003.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
4

Novikov, Anatoliy, Tat'yana Solodkaya, Aleksandr Lazerson, and Viktor Polyak. Econometric modeling in the GRETL package. ru: INFRA-M Academic Publishing LLC., 2023. http://dx.doi.org/10.12737/1732940.

Full text
Abstract:
The tutorial describes the capabilities of the GRETL statistical package for computer data analysis and econometric modeling based on spatial data and time series. Using concrete economic examples, the book considers classical and generalized models of linear and nonlinear regression, methods for detecting and eliminating multicollinearity, models with variable structure, autoregressive processes, methods for testing and eliminating autocorrelation, as well as discrete choice models and systems of simultaneous equations. For the convenience of users, all the task data files used in the book (in the .gdt format) are collected in a cloud application so that users have access to them. Meets the requirements of the latest-generation federal state educational standards of higher education in the disciplines "Econometrics" and "Econometric Modeling". For students and teachers of economic universities in the field of economics, as well as researchers who use econometric methods to model socio-economic phenomena and processes.
APA, Harvard, Vancouver, ISO, and other styles
More sources

Book chapters on the topic "Computer process variables"

1

do Prado, Hércules Antonio, Fábio Bianchi Campos, Edilson Ferneda, Nildo Nunes Cornelio, and Aluizio Haendchen Filho. "Prediction of Software Quality Based on Variables from the Development Process." In Lecture Notes in Computer Science, 71–77. Berlin, Heidelberg: Springer Berlin Heidelberg, 2013. http://dx.doi.org/10.1007/978-3-642-37343-5_8.

Full text
APA, Harvard, Vancouver, ISO, and other styles
2

Chen, Zhipeng, Zhang Peng, Xueqiang Zou, and Haoqi Sun. "Deep Learning Based Anomaly Detection for Muti-dimensional Time Series: A Survey." In Communications in Computer and Information Science, 71–92. Singapore: Springer Singapore, 2022. http://dx.doi.org/10.1007/978-981-16-9229-1_5.

Full text
Abstract:
Multi-dimensional time series are multiple sets of variables collected in chronological order, the result of observing some underlying process at a given sampling rate. They can describe both space and time and are widely used in many fields, such as system-state anomaly detection. However, multi-dimensional time series suffer from problems such as dimensional explosion and data sparseness, as well as complex pattern features such as periods and trends. These characteristics cause rule-based anomaly detection methods to perform poorly. In the big-data scenario, deep learning methods have begun to be applied to anomaly detection tasks for multi-dimensional time series due to their wide coverage and strong learning ability. This work first summarizes the definitions of anomaly detection for multi-dimensional time series and the challenges it faces. Related methods are surveyed, with an emphasis on deep learning-based methods, and the existing work with its advantages and disadvantages is summarized. Finally, the shortcomings of the existing algorithms are identified and future research directions are explored.
APA, Harvard, Vancouver, ISO, and other styles
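As a concrete (non-deep) baseline for the reconstruct-and-threshold recipe that many of the surveyed deep models follow, one can reconstruct a multi-dimensional series from its principal components and flag time steps with a large reconstruction error. The synthetic three-sensor series, the injected anomaly, and the 4-sigma threshold below are all invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic 3-dimensional series: two latent factors drive three sensors.
n = 500
z = rng.normal(size=(n, 2))
X = z @ np.array([[1.0, 0.5, 0.2], [0.0, 1.0, 0.8]]) + 0.05 * rng.normal(size=(n, 3))
X[250] += np.array([4.0, -4.0, 4.0])   # inject one anomalous time step

# Reconstruct each observation from its top-2 principal components; a
# large reconstruction error flags an anomaly (deep autoencoders follow
# the same reconstruct-and-threshold idea with a learned nonlinear map).
Xc = X - X.mean(axis=0)
_, _, Vt = np.linalg.svd(Xc, full_matrices=False)
proj = Vt[:2]
err = np.linalg.norm(Xc - Xc @ proj.T @ proj, axis=1)
threshold = err.mean() + 4 * err.std()
print(np.flatnonzero(err > threshold))  # → [250]
```

The injected point sticks out because it leaves the 2-D subspace the normal data live in; deep methods aim to learn such normal manifolds when they are nonlinear or time-dependent.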
3

Janicki, Aleksander, and Aleksander Weron. "Computer Simulation of α-Stable Random Variables." In Simulation and Chaotic Behavior of α-Stable Stochastic Processes, 35–65. Boca Raton: CRC Press, 2021. http://dx.doi.org/10.1201/9781003208877-3.

Full text
APA, Harvard, Vancouver, ISO, and other styles
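This chapter concerns simulating α-stable random variables; the standard approach is the Chambers-Mallows-Stuck transformation of a uniform and an exponential variate. A minimal sketch for the symmetric (β = 0), unit-scale case is shown below, with the sanity check that α = 2 recovers a Gaussian of variance 2; the parameter values and sample size are arbitrary.

```python
import numpy as np

def symmetric_stable(alpha, size, rng):
    # Chambers-Mallows-Stuck sampler for symmetric (beta = 0)
    # alpha-stable variables with unit scale. For alpha = 1 the last
    # factor's exponent is 0, leaving tan(V), i.e. a standard Cauchy.
    V = rng.uniform(-np.pi / 2, np.pi / 2, size)
    W = rng.exponential(1.0, size)
    return (np.sin(alpha * V) / np.cos(V) ** (1 / alpha)
            * (np.cos(V - alpha * V) / W) ** ((1 - alpha) / alpha))

rng = np.random.default_rng(1)
gauss = symmetric_stable(2.0, 50_000, rng)   # alpha = 2 is Gaussian, variance 2
print(float(np.std(gauss)))  # close to sqrt(2) ~ 1.414
```

Smaller α (e.g. 1.5) produces the heavy-tailed samples for which moment-based checks break down, which is why stable simulation needs this transformation rather than naive inversion.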
4

Martínez-Rojas, A., A. Jiménez-Ramírez, J. G. Enríquez, and H. A. Reijers. "Analyzing Variable Human Actions for Robotic Process Automation." In Lecture Notes in Computer Science, 75–90. Cham: Springer International Publishing, 2022. http://dx.doi.org/10.1007/978-3-031-16103-2_8.

Full text
APA, Harvard, Vancouver, ISO, and other styles
5

Xiao, Zedong, Junli Zhao, Xuejun Qiao, and Fuqing Duan. "Craniofacial Reconstruction Using Gaussian Process Latent Variable Models." In Computer Analysis of Images and Patterns, 456–64. Cham: Springer International Publishing, 2015. http://dx.doi.org/10.1007/978-3-319-23192-1_38.

Full text
APA, Harvard, Vancouver, ISO, and other styles
6

Nickisch, Hannes, and Carl Edward Rasmussen. "Gaussian Mixture Modeling with Gaussian Process Latent Variable Models." In Lecture Notes in Computer Science, 272–82. Berlin, Heidelberg: Springer Berlin Heidelberg, 2010. http://dx.doi.org/10.1007/978-3-642-15986-2_28.

Full text
APA, Harvard, Vancouver, ISO, and other styles
7

Debar, Hervé, Marc Dacier, Mehdi Nassehi, and Andreas Wespi. "Fixed vs. variable-length patterns for detecting suspicious process behavior." In Computer Security — ESORICS 98, 1–15. Berlin, Heidelberg: Springer Berlin Heidelberg, 1998. http://dx.doi.org/10.1007/bfb0055852.

Full text
APA, Harvard, Vancouver, ISO, and other styles
8

Srivastava, Praveen Ranjan. "Test Process Model with Enhanced Approach of State Variable." In Communications in Computer and Information Science, 181–92. Berlin, Heidelberg: Springer Berlin Heidelberg, 2010. http://dx.doi.org/10.1007/978-3-642-14825-5_16.

Full text
APA, Harvard, Vancouver, ISO, and other styles
9

Bo, Cuimei, Jun Li, Zhiquan Wang, and Jinguo Lin. "Adaptive Neural Model Based Fault Tolerant Control for Multi-variable Process." In Lecture Notes in Computer Science, 596–601. Berlin, Heidelberg: Springer Berlin Heidelberg, 2006. http://dx.doi.org/10.1007/978-3-540-37275-2_72.

Full text
APA, Harvard, Vancouver, ISO, and other styles
10

Belov, Anton, and Zbigniew Stachniak. "Improving Variable Selection Process in Stochastic Local Search for Propositional Satisfiability." In Lecture Notes in Computer Science, 258–64. Berlin, Heidelberg: Springer Berlin Heidelberg, 2009. http://dx.doi.org/10.1007/978-3-642-02777-2_25.

Full text
APA, Harvard, Vancouver, ISO, and other styles

Conference papers on the topic "Computer process variables"

1

Kefei, Wang, Lu Ming, and Ke Hongdi. "Corn Drying Process Variables Screening and Its Moisture Content Forecast." In 7th International Conference on Education, Management, Information and Computer Science (ICEMC 2017). Paris, France: Atlantis Press, 2017. http://dx.doi.org/10.2991/icemc-17.2017.113.

Full text
APA, Harvard, Vancouver, ISO, and other styles
2

Jawaha, S., and P. Ramamoorthy. "Dynamic optimization of injection molding process variables by evolutionary programming methods." In 2012 International Conference on Computer Communication and Informatics (ICCCI). IEEE, 2012. http://dx.doi.org/10.1109/iccci.2012.6158902.

Full text
APA, Harvard, Vancouver, ISO, and other styles
3

Groppetti, Roberto, and Giuseppe Comi. "Contribution to Computer Control and Optimization of Hydro-Abrasive Jet Machining Process." In ASME 1991 International Computers in Engineering Conference and Exposition. American Society of Mechanical Engineers, 1991. http://dx.doi.org/10.1115/cie1991-0175.

Full text
Abstract:
Hydro-abrasive jet machining (HAJM) has demonstrated its suitability for several applications in the machining of a wide spectrum of materials (metals, polymers, ceramics, fibre-reinforced composites, etc.). The paper is a contribution to the computer control, integration and optimization of the HAJM process, aiming to establish a hierarchical control architecture and a platform for the implementation of a real-time Adaptive Control Optimization (ACO) module. The paper presents the approach followed and the main results obtained during the development and implementation of a HAJM cell and its computerized controller. A critical analysis of the process variables available in the literature is presented in order to identify the process variables and to define a process model suitable for HAJM real-time control and optimization. Besides HAJM computer control, a process model and an optimization procedure are necessary to correlate process variables and parameters with machining results, so as to avoid expensive and time-consuming experiments for the determination of optimal machining conditions. The paper presents the configuration of the cell and the specific components adopted to make fully computerized control of the process possible, and the architecture of the controller, capable of managing the several logical and analogical signals from the different modules of the cell, for multiprogramming, process monitoring, control, process parameter predetermination, and multiobjective optimization of process conditions. A prediction and optimization model is presented that allows the identification of optimal machining conditions using multiobjective programming. This model is based on the definition of an economy function and a productivity function, with suitable constraints relevant to the required machining quality, the required kerfing depth, and the available resources. A test case based on experimental results is discussed in order to validate the model.
APA, Harvard, Vancouver, ISO, and other styles
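The paper's optimization step (an economy function and a productivity function under machining-quality constraints) can be caricatured as constrained selection over candidate settings. Every functional form and number below is a made-up placeholder, not the paper's actual model.

```python
# Candidate process settings: (water pressure [MPa], abrasive flow [g/s]).
candidates = [(150, 2), (200, 4), (250, 6), (300, 8), (350, 10)]

def productivity(p, m):   # assumed: cut rate grows with pressure and abrasive
    return 0.01 * p + 0.5 * m

def cost(p, m):           # assumed: energy plus abrasive consumables
    return 0.002 * p + 0.9 * m

def roughness(p, m):      # assumed surface-quality proxy (lower is better)
    return 30 - 0.05 * p - 0.4 * m

# Keep only settings meeting the quality constraint, then trade off
# productivity against cost with a scalarizing weight of 0.5.
feasible = [(p, m) for p, m in candidates if roughness(p, m) <= 16.0]
best = max(feasible, key=lambda s: productivity(*s) - 0.5 * cost(*s))
print(best)  # → (350, 10)
```

A weighted sum is the simplest scalarization of a multiobjective problem; the choice of weight moves the solution along the economy/productivity trade-off curve.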
4

Park, Sunhee, Dong Ha Kim, Ko Ryu Kim, and Song-Won Chol. "An Integration of the Restructured MELCOR for the MIDAS Computer Code." In 14th International Conference on Nuclear Engineering. ASMEDC, 2006. http://dx.doi.org/10.1115/icone14-89712.

Full text
Abstract:
The need to develop a localized severe accident analysis code is growing. KAERI is developing a severe accident code called MIDAS, which is based on MELCOR. In order to develop the localized code (MIDAS), which simulates a severe accident in a nuclear power plant, the existing data structure is reconstructed for all the packages in MELCOR, which uses pointer variables for data transfer between the packages. During this process, new features of FORTRAN90 such as dynamic allocation are used for an improved data saving and transfer method. Hence the readability, maintainability and portability of the MIDAS code have been enhanced. After the package-wise restructuring, the newly converted packages are integrated together. Depending on data usage, two types of packages can be defined: some use their own data only within the package (call them independent packages), while the others share their data with other packages (dependent packages). For the independent packages, the integration process simply links the already converted packages together; the package-wise restructuring does not require any further conversion of variables. For the dependent packages, extra conversion is necessary to link them together. Because the package-wise restructuring converts only the corresponding package's variables, variables defined in other packages are left untouched and remain as they were. These variables must be converted into the new variable types at the same time as the main variables of the corresponding package; the dependent packages are then ready for integration.
In order to verify that the integration process works correctly, the integrated results were compared with the package-wise restructured results for each package. The sequences calculated included a steady state and an SBO (Station Blackout) accident; the major variables were the same, as were the graph trends. Throughout the integration process, a base was established for code improvements and the addition of new models. The integration process proposed in this paper will be extended to the T/H and F/P packages in the MIDAS development program.
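The verification step described in this abstract, comparing major variables between the restructured and integrated code versions, can be sketched generically. This is not MIDAS code; the variable names, values, and tolerance below are invented for illustration:

```python
# Generic sketch of verifying one code version against another: every major
# variable must agree within a relative tolerance (names/values are invented).

def results_match(run_a, run_b, rel_tol=1e-6):
    """True if every variable in run_a agrees with run_b within rel_tol."""
    return all(
        abs(run_a[k] - run_b[k]) <= rel_tol * max(abs(run_a[k]), 1.0)
        for k in run_a
    )

restructured = {"core_temp": 1532.4, "rcs_pressure": 15.51e6}  # hypothetical
integrated = {"core_temp": 1532.4, "rcs_pressure": 15.51e6}    # hypothetical
print(results_match(restructured, integrated))  # True
```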
APA, Harvard, Vancouver, ISO, and other styles
5

Sorva, Juha, Ville Karavirta, and Ari Korhonen. "Roles of Variables in Teaching." In InSITE 2007: Informing Science + IT Education Conference. Informing Science Institute, 2007. http://dx.doi.org/10.28945/3100.

Full text
Abstract:
Expert programmers possess schemas: abstractions of concrete experiences that help them solve programming problems. Stereotypes of variable use in computer programs can be characterized using roles of variables, which can be taught to novices in order to support schema formation. This paper describes the 'lightweight' adoption of roles of variables in our teaching of introductory programming. We discuss the changes made to our courses, our experiences with this process, and some preliminary results on how our students took to roles of variables. Roles provided us with a new way to improve our teaching materials and methods. They are easily linked to code and to pseudocode designs, and we found them easy to use both as a documentation tool and for the stepwise refinement of programs. Our results indicate that more than a very 'lightweight' introduction of roles is needed for students to adopt them into active use.
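For readers unfamiliar with the roles-of-variables taxonomy the paper builds on, a minimal illustration (not taken from the paper; the function and data are invented) annotates some common roles in a short routine:

```python
# A sketch annotating roles of variables in a routine that scans a data stream.

def largest_reading(readings):
    limit = 100.0          # fixed value: set once, never changed afterwards
    count = 0              # stepper: steps through a predictable sequence
    best = float("-inf")   # most-wanted holder: best acceptable value so far
    total = 0.0            # gatherer: accumulates values as they arrive
    for value in readings: # value is a most-recent holder: latest item seen
        count += 1
        total += value
        if value > best and value <= limit:
            best = value
    return best, total / count

print(largest_reading([3.2, 7.5, 4.1]))  # (7.5, mean of the readings)
```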
APA, Harvard, Vancouver, ISO, and other styles
6

Wilczynski, K., and A. Nastaj. "SSEM-AG Computer Model for Optimization of Polymer Extrusion." In ASME 2006 International Mechanical Engineering Congress and Exposition. ASMEDC, 2006. http://dx.doi.org/10.1115/imece2006-13074.

Full text
Abstract:
The optimization of an extrusion process is a conflicting, multi-objective problem. It is complicated by the number of variables (screw/die geometry, operating conditions, material data) and their non-linear relations, as well as by opposing criteria, for example extrusion throughput versus power consumption. It is difficult to find the global optimum for the process while avoiding local optima. There are two approaches to the problem: experimental, and via a mathematical model of extrusion. Optimization techniques based on experimentation are time-consuming and very expensive. In this paper we present an optimization methodology based on Genetic Algorithms (AG), where the response surface is given by the extrusion model. A mathematical Single-Screw Extrusion Model (SSEM) developed at the Warsaw University of Technology is used to predict the extruder behavior, and the AG approach is used for optimization. An integrated SSEM-AG system was developed to study optimization of the single-screw extrusion process. Three design criteria (output variables) are selected for optimization: maximum extrusion throughput, minimum power consumption and low melt temperature. Screw speed, barrel temperature and screw channel depth are chosen as input variables.
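The coupling the abstract describes, a genetic algorithm searching the input variables over a response surface supplied by a process model, can be sketched as follows. The quadratic surrogate below is a stand-in for the SSEM simulation, and the variable ranges, toy relations, and scalarization weights are assumptions for illustration only, not values from the paper:

```python
# Hedged sketch: a genetic algorithm minimizing a scalarized cost over
# (screw speed, barrel temperature, channel depth). The surrogate replaces
# the real extrusion model; all numbers here are illustrative assumptions.
import random

random.seed(0)

BOUNDS = [(20, 120), (180, 260), (2, 8)]  # rpm, deg C, mm (assumed ranges)

def surrogate_cost(x):
    """Stand-in for SSEM: penalize low throughput, high power, high melt temp."""
    speed, temp, depth = x
    throughput = speed * depth                 # toy relation
    power = 0.05 * speed ** 2 + 0.1 * temp     # toy relation
    melt_temp = temp + 0.2 * speed             # toy relation
    return -throughput + 0.5 * power + 0.2 * melt_temp

def random_individual():
    return [random.uniform(lo, hi) for lo, hi in BOUNDS]

def mutate(x):
    # Gaussian perturbation, clipped back into the feasible box
    return [min(hi, max(lo, v + random.gauss(0, (hi - lo) * 0.05)))
            for v, (lo, hi) in zip(x, BOUNDS)]

def crossover(a, b):
    # Uniform crossover: each gene taken from either parent
    return [random.choice(pair) for pair in zip(a, b)]

def optimize(generations=60, pop_size=30):
    pop = [random_individual() for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=surrogate_cost)
        elite = pop[: pop_size // 2]           # keep the better half
        children = [mutate(crossover(random.choice(elite), random.choice(elite)))
                    for _ in range(pop_size - len(elite))]
        pop = elite + children
    return min(pop, key=surrogate_cost)

best = optimize()
print([round(v, 1) for v in best], round(surrogate_cost(best), 1))
```

With this surrogate the search is driven toward high depth and speed and low barrel temperature; swapping in a real process model only changes `surrogate_cost`.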
APA, Harvard, Vancouver, ISO, and other styles
7

Makarova, E. A., and D. G. Lagerev. "Using Visual Modelsfor Exploratory Analysis of Semi-structured Text Data." In 32nd International Conference on Computer Graphics and Vision. Keldysh Institute of Applied Mathematics, 2022. http://dx.doi.org/10.20948/graphicon-2022-1090-1101.

Full text
Abstract:
The processing of semi-structured textual data for further use in DM models is a labor-intensive process which, in addition to material costs, can increase the time required to build a model and, as a result, worsen the efficiency of decision-making. This article presents visual models of semi-structured text data and methods for their processing at the stage of exploratory analysis. Exploratory analysis reduces the time needed to select significant variables at the initial stage of the study and, later, avoids the processing of redundant or insignificant variables. The use of visualization helps include in the DM model and process only data that will improve the model's quality. The paper describes the process of using visualization of textual data during exploratory analysis and the construction of two types of visual models: interactive 'quantitative' visualization, and visualization of relationships between words and other variables in the data under study. The developed models are tested on the example of labor market analysis. Examples of visualizing the content of the 'soft skills' field from CVs and vacancies are presented, displaying both the skills most often mentioned by applicants from various professional fields and the impact of mentioning these skills on being invited for interviews. The experiment showed that the developed visual models make it possible to determine, at the stage of exploratory analysis, whether a text variable needs to be included in the DM model.
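The 'quantitative' side of such an analysis, counting mentions in a free-text field before deciding whether the variable is worth including in a DM model, can be sketched in a few lines; the sample data below are invented, not from the paper:

```python
# Toy sketch of counting skill mentions in a semi-structured "soft skills"
# field prior to visualization (sample records are invented).
from collections import Counter

soft_skills_fields = [
    "teamwork, communication, time management",
    "communication, leadership",
    "teamwork, communication",
]
counts = Counter(
    skill.strip()
    for field in soft_skills_fields
    for skill in field.split(",")
)
print(counts.most_common(2))  # [('communication', 3), ('teamwork', 2)]
```

In practice these counts would feed a bar chart or word cloud; the exploratory decision is whether the resulting distribution is informative enough to keep the variable.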
APA, Harvard, Vancouver, ISO, and other styles
8

Lu, Chi-Jie, Yuehjen E. Shao, and Yu-Chiun Wang. "Combining independent component analysis and support vector machine for identifying fault quality variables in the multivariate process." In 2010 International Symposium on Computer, Communication, Control and Automation (3CA). IEEE, 2010. http://dx.doi.org/10.1109/3ca.2010.5533794.

Full text
APA, Harvard, Vancouver, ISO, and other styles
9

Rodríguez, Alicia B., Esmeralda Niño, Jose M. Castro, Marcelo Suarez, and Mauricio Cabrera. "Injection Molding Process Windows Considering Two Conflicting Criteria: Simulation Results." In ASME 2012 International Mechanical Engineering Congress and Exposition. American Society of Mechanical Engineers, 2012. http://dx.doi.org/10.1115/imece2012-89678.

Full text
Abstract:
In this work, two conflicting criteria are considered simultaneously to determine a process window for injection molding. The best compromises between the two criteria are identified through the application of multiple-criteria optimization concepts. The aim of this work is to provide a formal and realistic strategy for setting processing conditions in injection molding operations. In order to keep the main ideas manageable, the development of the strategy is constrained to two controllable variables in computer-simulated parts.
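The core multiple-criteria idea in the abstract, identifying the best compromises as the non-dominated candidates under two conflicting criteria, can be sketched as follows; the candidate points are invented, not the paper's simulation results:

```python
# Sketch of Pareto filtering: keep settings for which no other setting is
# at least as good in both criteria (both minimized here).

def pareto_front(points):
    """Return the non-dominated points under minimization of both criteria."""
    front = []
    for p in points:
        dominated = any(q[0] <= p[0] and q[1] <= p[1] and q != p
                        for q in points)
        if not dominated:
            front.append(p)
    return front

# Hypothetical (cycle time, warpage) pairs for injection-molding settings
candidates = [(30, 0.9), (35, 0.5), (40, 0.4), (32, 0.8), (45, 0.45)]
print(pareto_front(candidates))
```

Here (45, 0.45) drops out because (40, 0.4) is better on both criteria; the surviving points are the trade-off curve from which a process window is chosen.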
APA, Harvard, Vancouver, ISO, and other styles
10

Hull, Emmett, Weston Grove, Meng Zhang, Xiaoxu Song, Z. J. Pei, and Weilong Cong. "Effects of Process Variables on Extrusion of Carbon Fiber Reinforced ABS Filament for Additive Manufacturing." In ASME 2015 International Manufacturing Science and Engineering Conference. American Society of Mechanical Engineers, 2015. http://dx.doi.org/10.1115/msec2015-9396.

Full text
Abstract:
Additive manufacturing (3D printing) is a class of manufacturing processes where material is deposited in a layer-by-layer fashion to fabricate a three-dimensional part directly from a computer-aided design (CAD) model. With a current market share of 44%, thermoplastic-based additive manufacturing such as fused deposition modeling (FDM) is a prevailing technology. A preliminary extrusion process is required to produce thermoplastic filaments for use in FDM 3D printers. It is crucial that the extruded filament have consistent dimensional accuracy for FDM 3D printers to produce the desired object with precision. In this study, carbon fibers were blended with acrylonitrile butadiene styrene (ABS) thermoplastic to produce carbon fiber reinforced ABS filaments, in order to improve the mechanical properties of FDM-printed objects. During filament extrusion, three process variables showed significant effects on filament diameter, expansion percentage, and extrusion rate: carbon fiber content, extrusion temperature, and nozzle size. The objective of this study is to test the feasible ranges of these process variables and to investigate their effects on filament extrusion. Results of this study will provide knowledge for quality improvement of carbon fiber reinforced ABS filament extrusion for additive manufacturing.
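A study like this one tests combinations of process-variable levels. A minimal sketch enumerates a full-factorial design over the three variables named above; the specific levels are assumed for illustration, not taken from the paper:

```python
# Enumerate every combination of assumed levels for the three process
# variables (full-factorial design); levels below are hypothetical.
from itertools import product

fiber_content = [0, 5, 10, 15]    # wt% carbon fiber (assumed levels)
extrusion_temp = [200, 220, 240]  # deg C (assumed levels)
nozzle_size = [1.75, 3.0]         # mm (assumed levels)

runs = list(product(fiber_content, extrusion_temp, nozzle_size))
print(len(runs))  # 4 * 3 * 2 = 24 experimental conditions
```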
APA, Harvard, Vancouver, ISO, and other styles

Reports on the topic "Computer process variables"

1

Seginer, Ido, James Jones, Per-Olof Gutman, and Eduardo Vallejos. Optimal Environmental Control for Indeterminate Greenhouse Crops. United States Department of Agriculture, August 1997. http://dx.doi.org/10.32747/1997.7613034.bard.

Full text
Abstract:
Increased world competition, as well as increased concern for the environment, drive all manufacturing systems, including greenhouses, towards high-precision operation. Optimal control is an important tool to achieve this goal, since it finds the best compromise between conflicting demands, such as higher profits and environmental concerns. The report, which is a collection of papers, each with its own abstract, outlines an approach for optimal, model-based control of the greenhouse environment. A reliable crop model is essential for this approach and a significant portion of the effort went in this direction, resulting in a radically new version of the tomato model TOMGRO, which can be used as a prototype model for other greenhouse crops. Truly optimal control of a very complex system requires prohibitively large computer resources. Two routes to model simplification have therefore been tried: model reduction (to fewer state variables) and simplified decision making. Crop model reduction from nearly 70 state variables to about 5 was accomplished either by selecting a subset of the original variables or by forming combinations of them. Model dynamics were then fitted either with mechanistic relationships or with neural networks. To simplify the decision-making process, the number of costate variables (control policy parameters) was reduced to one or two. The dry-matter state variable was transformed in such a way that its costate became essentially constant throughout the season. A quasi-steady-state control algorithm was implemented in an experimental greenhouse. A constant value for the dry-matter costate was able to control ventilation and CO2 enrichment simultaneously by continuously producing weather-dependent optimal setpoints and then maintaining them closely.
APA, Harvard, Vancouver, ISO, and other styles
2

Nelson, Gena, Angela Crawford, and Jessica Hunt. A Systematic Review of Research Syntheses for Students with Mathematics Learning Disabilities and Difficulties. Boise State University, Albertsons Library, January 2022. http://dx.doi.org/10.18122/sped.143.boisestate.

Full text
Abstract:
The purpose of this document is to provide readers with the coding protocol that authors used to code 36 research syntheses (including meta-analyses, evidence-based reviews, and quantitative systematic reviews) focused on mathematics interventions for students with learning disabilities (LD), mathematics learning disabilities (MLD), and mathematics difficulties (MD). The purpose of the systematic review of mathematics intervention syntheses was to identify patterns and gaps in content areas, instructional strategies, effect sizes, and definitions of LD, MLD, and MD. We searched the literature for research syntheses published between 2000 and 2020 and used rigorous inclusion criteria in our literature review process. We evaluated 36 syntheses that included 836 studies with 32,495 participants. We coded each synthesis for variables across seven categories: publication codes (authors, year, journal), inclusion and exclusion criteria, content area focus, instructional strategy focus, sample size, methodological information, and results. The mean interrater reliability across all codes using this coding protocol was 90.3%. Although each synthesis stated a focus on LD, MLD, or MD, very few students with LD or MLD were included, and authors' operational definitions of disability and risk varied. Syntheses predominantly focused on word problem solving, fractions, computer-assisted learning, and schema-based instruction. Syntheses reported wide variation in effectiveness, content areas, and instructional strategies. Finally, our results indicate that the majority of syntheses report achievement outcomes, but very few report on other outcomes (e.g., social validity, strategy use). We discuss how the results of this comprehensive review can guide researchers in expanding the knowledge base on mathematics interventions.
The systematic review resulting from this coding process has been accepted for publication and is in press at Learning Disabilities Research and Practice.
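The 90.3% mean interrater reliability reported above is a percent-agreement style statistic; a minimal sketch of computing it for two coders follows, with invented codes:

```python
# Percent agreement between two coders over the same items
# (the code sequences below are invented examples).

def percent_agreement(coder_a, coder_b):
    """Share of items on which both coders assigned the same code, as a %."""
    matches = sum(a == b for a, b in zip(coder_a, coder_b))
    return 100.0 * matches / len(coder_a)

a = ["LD", "MD", "MLD", "MD", "LD", "MD", "MD", "LD", "MLD", "MD"]
b = ["LD", "MD", "MLD", "LD", "LD", "MD", "MD", "LD", "MD",  "MD"]
print(percent_agreement(a, b))  # 80.0
```

Percent agreement does not correct for chance; chance-corrected statistics such as Cohen's kappa are a common alternative.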
APA, Harvard, Vancouver, ISO, and other styles
3

Plueddemann, Albert, Benjamin Pietro, and Emerson Hasbrouck. The Northwest Tropical Atlantic Station (NTAS): NTAS-19 Mooring Turnaround Cruise Report Cruise On Board RV Ronald H. Brown October 14 - November 1, 2020. Woods Hole Oceanographic Institution, January 2021. http://dx.doi.org/10.1575/1912/27012.

Full text
Abstract:
The Northwest Tropical Atlantic Station (NTAS) was established to address the need for accurate air-sea flux estimates and upper ocean measurements in a region with strong sea surface temperature anomalies and the likelihood of significant local air–sea interaction on interannual to decadal timescales. The approach is to maintain a surface mooring outfitted for meteorological and oceanographic measurements at a site near 15°N, 51°W by successive mooring turnarounds. These observations will be used to investigate air–sea interaction processes related to climate variability. This report documents recovery of the NTAS-18 mooring and deployment of the NTAS-19 mooring at the same site. Both moorings used Surlyn foam buoys as the surface element. These buoys were outfitted with two Air–Sea Interaction Meteorology (ASIMET) systems. Each system measures, records, and transmits via Argos satellite the surface meteorological variables necessary to compute air–sea fluxes of heat, moisture and momentum. The upper 160 m of the mooring line were outfitted with oceanographic sensors for the measurement of temperature, salinity and velocity. Deep ocean temperature and salinity are measured at approximately 38 m above the bottom. The mooring turnaround was done on the National Oceanic and Atmospheric Administration (NOAA) Ship Ronald H. Brown, Cruise RB-20-06, by the Upper Ocean Processes Group of the Woods Hole Oceanographic Institution. The cruise took place between 14 October and 1 November 2020. The NTAS-19 mooring was deployed on 22 October, with an anchor position of about 14° 49.48′ N, 51° 00.96′ W in 4985 m of water. A 31-hour intercomparison period followed, during which satellite telemetry data from the NTAS-19 buoy and the ship's meteorological sensors were monitored. The NTAS-18 buoy, which had gone adrift on 28 April 2020, was recovered on 20 October near 13° 41.96′ N, 58° 38.67′ W.
This report describes these operations, as well as other work done on the cruise and some of the pre-cruise buoy preparations.
APA, Harvard, Vancouver, ISO, and other styles
4

Bigorre, Sebastien P., and Raymond Graham. The Northwest Tropical Atlantic Station (NTAS): NTAS-20 Mooring Turnaround Cruise Report Cruise On Board RV Pisces November 4-28, 2021 Newport, RI - Pascagoula, MS. Woods Hole Oceanographic Institution, February 2023. http://dx.doi.org/10.1575/1912/29647.

Full text
Abstract:
The Northwest Tropical Atlantic Station (NTAS) was established to address the need for accurate air-sea flux estimates and upper ocean measurements in a region with strong sea surface temperature anomalies and the likelihood of significant local air–sea interaction on interannual to decadal timescales. The approach is to maintain a surface mooring outfitted for meteorological and oceanographic measurements at a site near 15°N, 51°W by successive mooring turnarounds. These observations are used to investigate air–sea interaction processes related to climate variability. The NTAS Ocean Reference Station (ORS NTAS) is supported by the National Oceanic and Atmospheric Administration's (NOAA) Global Ocean Monitoring and Observing (GOMO) Program (formerly the Ocean Observing and Monitoring Division). This report documents recovery of the NTAS-19 mooring and deployment of the NTAS-20 mooring at the same site. Both moorings used Surlyn foam buoys as the surface element. These buoys were outfitted with two Air–Sea Interaction Meteorology (ASIMET) systems. Each system measures, records, and transmits via satellite the surface meteorological variables necessary to compute air–sea fluxes of heat, moisture and momentum. The upper 160 m of the mooring line were outfitted with oceanographic sensors for the measurement of temperature, salinity and velocity. The mooring turnaround was done by the Upper Ocean Processes Group of the Woods Hole Oceanographic Institution (WHOI), onboard R/V Pisces, Cruise PC-21-07. The cruise took place from November 4 to 28, 2021. The NTAS-20 mooring was deployed on November 12, and the NTAS-19 mooring was recovered on November 13. A limited intercomparison between ship and buoys was performed on this cruise. This report describes these operations and the pre-cruise buoy preparations. Other operations during PC-21-07 consisted of one CTD cast near the Meridional Overturning Variability Experiment (MOVE) subsurface mooring array MOVE 1-14.
MOVE is designed to monitor the integrated deep meridional flow in the tropical North Atlantic.
APA, Harvard, Vancouver, ISO, and other styles
