Journal articles on the topic 'Heat engineering Decision support systems'

Consult the top 50 journal articles for your research on the topic 'Heat engineering Decision support systems.'


1

Chernysheva, T. Yu, A. A. Zakharova, E. V. Telipenko, and E. V. Molnina. "Overview Information Systems for Calculating Processes Welding Stainless Steels." Materials Science Forum 938 (October 2018): 12–17. http://dx.doi.org/10.4028/www.scientific.net/msf.938.12.

Abstract:
The possibilities of using nanomaterials and nanocoatings in machine building are considered. A review of software for engineering calculations of welding and heat-treatment processes is carried out. A decision support system for choosing a rational amount of nanostructured modifier powders for welding corrosion-resistant steels is proposed.
2

Ma, Shou Cai. "Research on Agent-Based Energy Saving System of Civil." Advanced Materials Research 816-817 (September 2013): 1220–24. http://dx.doi.org/10.4028/www.scientific.net/amr.816-817.1220.

Abstract:
This paper analyzes the urban civil building system and its energy-saving decision-making mechanism, including the system components and the related regulation mechanisms, on the basis of the proposed energy-saving urban civil building system. It classifies the energy-saving decisions made for the various components of building systems, covering building heat supply, the urban heating network, and energy saving in individual buildings. On this basis, it constructs an agent-based decision support system for urban civil energy saving, in which the agents realize the basic mechanisms for proposing urban energy-saving decisions.
3

Rolander, Nathan, Jeffrey Rambo, Yogendra Joshi, Janet K. Allen, and Farrokh Mistree. "An Approach to Robust Design of Turbulent Convective Systems." Journal of Mechanical Design 128, no. 4 (2006): 844–55. http://dx.doi.org/10.1115/1.2202882.

Abstract:
The complex turbulent flow regimes encountered in many thermal-fluid engineering applications have proven resistant to the effective application of systematic design because of the computational expense of model evaluation and the inherent variability of turbulent systems. In this paper, the integration of a novel reduced-order turbulent convection modeling approach, based upon the proper orthogonal decomposition technique, with robust design principles implemented using the compromise decision support problem is investigated as an effective design approach for this domain. In the example application considered, thermally efficient computer server cabinet configurations that are insensitive to variations in operating conditions are determined. The computer servers are cooled by turbulent convection and have unsteady heat generation and cooling air flows, yielding substantial variability, yet have some of the most stringent operational requirements of any engineering system. Results for an enclosed cabinet example show that the resulting robust, thermally efficient configurations can dissipate up to 50% greater heat loads, with a 60% decrease in temperature variability, using the same cooling infrastructure.
4

Nalepa, Grzegorz, and Antoni Ligęza. "The HeKatE methodology. Hybrid engineering of intelligent systems." International Journal of Applied Mathematics and Computer Science 20, no. 1 (2010): 35–53. http://dx.doi.org/10.2478/v10006-010-0003-9.

Abstract:
This paper describes a new approach, the HeKatE methodology, to the design and development of complex rule-based systems for control and decision support. The main paradigm for rule representation, namely eXtended Tabular Trees (XTT), ensures high density and transparency of visual knowledge representation. Contrary to traditional, flat rule-based systems, the XTT approach is focused on groups of similar rules rather than on single rules. Such groups form decision tables which are connected into a network for inference. Efficient inference is assured as only the rules necessary for achieving the goal, identified by the context of inference and the partial order among tables, are fired. In the paper a new version of the language, XTT2, is presented. It is based on ALSV(FD) logic, also described in the paper. Another distinctive feature of the presented approach is a top-down design methodology based on successive refinement of the project. It starts with Attribute Relationship Diagram (ARD) development. Such a diagram represents relationships between system variables. Based on the ARD scheme, XTT tables and links between them are generated. The tables are filled with expert-provided constraints on the values of the attributes. The code for rule representation is generated in a human-readable representation called HMR and interpreted with a provided inference engine called HeaRT. A set of software tools supporting the visual design and development stages is described in brief.
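The XTT idea of grouping similar rules into decision tables can be illustrated with a minimal sketch. The thermostat-style attributes and values below are invented for illustration; the real XTT2 semantics, ALSV(FD) logic, and HeaRT inference are considerably richer:

```python
def fire_table(table, facts):
    """Evaluate one decision table: the first rule whose attribute
    constraints all match the current facts fires and adds its
    conclusions (a loose sketch of XTT-style table-local inference,
    not the actual HMR/HeaRT semantics)."""
    for conditions, conclusions in table:
        if all(facts.get(attr) in allowed
               for attr, allowed in conditions.items()):
            facts.update(conclusions)
            break
    return facts

# Hypothetical thermostat-style table over finite attribute domains:
table = [
    ({"season": {"winter"}, "day": {"workday"}}, {"setpoint": 20}),
    ({"season": {"winter"}, "day": {"weekend"}}, {"setpoint": 18}),
]
facts = fire_table(table, {"season": "winter", "day": "weekend"})
print(facts["setpoint"])  # 18
```

In the actual methodology, the conclusions of one table feed the conditions of linked tables, so inference visits only the tables on the path to the goal.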
5

Han, Juyeoun, Kyungrai Kim, Sangyoon Chin, and Dongwoo Shin. "Decision-making support model for reusable construction materials in multiple project management." Canadian Journal of Civil Engineering 36, no. 2 (2009): 304–19. http://dx.doi.org/10.1139/l08-133.

Abstract:
Although reusable materials in a construction project need to be specifically managed by the head office of the construction company for proper retrieval and reuse, they are not efficiently managed; in many cases, construction companies treat reusable materials no differently from non-reusable materials. To manage reusable materials for effective and efficient procurement, the current procurement system needs to be improved based on records of usage from the multiple projects monitored and managed by the head office. Against this background, the objective of this paper is to identify and analyze the problems in the procurement process of reusable materials across current multiple projects through case studies of three construction companies. Based on the analysis, the paper emphasizes the need to implement systems for reusable-materials procurement requests, outgoing-materials quantity forecasts, and economic analysis for vehicle distribution, as well as the need to expand the applicable uses of a resource pool. A model is also proposed to support decision-making in the procurement of reusable materials.
6

Verstina, Natalia, and Evgeny Evseev. "Reengineering of the management processes of information support of heat supplying organizations of heating systems of a city." MATEC Web of Conferences 193 (2018): 05007. http://dx.doi.org/10.1051/matecconf/201819305007.

Abstract:
Changes in the management of heat-supplying organizations of the modern city, driven by the development of technologies in this sphere, are considered in the article. Three groups of technologies are identified, and priority is given to the auxiliary technologies considered. The authors point out the need for information support of heating-system management in the city, which requires fundamental reconsideration and radical redesign. An additional argument for reengineering the management of heat-supplying organizations is the results of their activity, which are associated with considerable thermal losses. The key provisions of the proposed reengineering measures, based on creating the processes of management information support "from scratch", are described. Negative changes in the performance of heating systems of particular types were estimated, and on that basis a new integrated approach to the choice of methods and technical means for diagnostics of engineering systems, together with analytical dependences for determining repair terms, was offered. Obtaining qualitatively new information through reengineering measures for managerial decision-making is considered. Scenarios for the energy industry, differentiated as either the most realistic or associated with a risk of aggravating stubborn problems that reengineering can minimize, were analyzed.
7

Dolgui, A., N. Guschinsky, and G. Levin. "A design of DSS for mass production machining systems." Bulletin of the Polish Academy of Sciences: Technical Sciences 57, no. 3 (2009): 265–71. http://dx.doi.org/10.2478/v10175-010-0128-x.

Abstract:
In this paper, we present a decision support system (DSS) for the preliminary design of transfer machines with rotary or mobile tables. In these transfer machines, the machining operations are executed on working positions equipped with standard multi-spindle heads. A part is sequentially machined on m working positions and is moved from one position to the next using a rotary or a mobile table. The operations are grouped into blocks, where the operations of the same block are simultaneously performed by one multi-spindle head. At the preliminary design stage, the goal is to select the number of working positions and to decide which spindle heads will be installed, minimizing the machine cost while respecting a given production rate. The paper presents the overall approach and describes the mathematical and decision-support methods developed and implemented in software for the optimization of the preliminary design (or reconfiguration) of such machining systems.
8

Nguyen, Dao Thi Anh, Insu Won, Kyusung Kim, and Jangwoo Kwon. "A Development of Clinical Decision Support System for Video Head Impulse Test Based on Fuzzy Inference System." Journal of Sensors 2018 (2018): 1–10. http://dx.doi.org/10.1155/2018/7168524.

Abstract:
This paper presents a clinical decision support system for the video head impulse test (vHIT) based on a fuzzy inference system. It examines the eye and head movements recorded by an eye-movement tracking device, calculates the vestibulo-ocular reflex (VOR) gain, and applies the fuzzy inference system to output the normality and artifact index of the test result. The position VOR gain and the proportion of covert and overt catch-up saccades (CUS) within the dataset are used as the inputs of the inference system. In addition, the system yields one more factor, the artifact index, which represents the interference present in the dataset. Data from fifteen vestibular neuritis (VN) patients and two normal subjects were evaluated. The artifact index appears to be very high on the lesion side of VN patients, indicating strong theoretical contradictions: low gain without CUS, or normal gain with the appearance of CUS. Both the intact side and the normal subjects show high normality and a low artifact index, although the intact side has slightly lower normality and a higher artifact index. In conclusion, this is a robust system, the first to take both gain and CUS into account, outputting not only the normality of the vHIT dataset but also its artifacts.
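As a rough illustration of one of the system's inputs, the VOR gain compares the eye response with the head impulse. The sketch below assumes the gain is taken as the ratio of the integrated absolute eye and head velocities; the paper's exact computation and the fuzzy membership functions are not reproduced here, and the traces are invented:

```python
def position_vor_gain(eye_velocity, head_velocity, dt=0.001):
    """Ratio of the areas under the eye- and head-velocity curves,
    using simple rectangle integration. Values near 1.0 indicate a
    normal vestibulo-ocular reflex; low gain suggests the lesion side."""
    area = lambda samples: sum(abs(v) * dt for v in samples)
    return area(eye_velocity) / area(head_velocity)

# Idealized impulse in which the eye fully compensates the head:
head = [0.0, 50.0, 150.0, 250.0, 150.0, 50.0, 0.0]
eye = [-v for v in head]  # compensatory movement, opposite direction
print(position_vor_gain(eye, head))  # 1.0
```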
9

Kuntoğlu, Mustafa, Abdullah Aslan, Danil Yurievich Pimenov, et al. "A Review of Indirect Tool Condition Monitoring Systems and Decision-Making Methods in Turning: Critical Analysis and Trends." Sensors 21, no. 1 (2020): 108. http://dx.doi.org/10.3390/s21010108.

Abstract:
The complex structure of turning makes it difficult to obtain the desired results in terms of tool wear and surface roughness, and the high temperatures and pressures involved make the cutting area hard to reach and observe. Indirect tool condition monitoring systems track the condition of the cutting tool via several released or converted energy types, namely heat, acoustic emission, vibration, cutting forces, and motor current. Tool wear inevitably progresses during metal cutting and is related to these energy types. Indirect tool condition monitoring systems use sensors situated around the cutting area to determine the wear condition of the cutting tool without intervening in the cutting zone. In this study, the sensors most used in indirect tool condition monitoring systems and the correlations between their signals and tool wear are reviewed, summarizing the literature in this field over the last two decades. Reviews of tool condition monitoring systems in turning are very limited, and the relationships between measured variables such as tool wear and vibration require detailed analysis. The main aim of this work is to discuss the effect of sensor data on tool wear assessment by considering previously published papers. As a computer-aided electronic and mechanical support system, tool condition monitoring paves the way for the machining industry and the future development of Industry 4.0.
10

Zeynal, Hossein, Zuhaina Zakaria, and Ahmad Kor. "Improved Lagrangian relaxation generation decision-support in presence of electric vehicles." Indonesian Journal of Electrical Engineering and Computer Science 22, no. 1 (2021): 598. http://dx.doi.org/10.11591/ijeecs.v22.i1.pp598-608.

Abstract:
Decision-making strategies for resources available at macro/micro scales have long been a critical question. Among the existing methods for such mixed-binary optimization models, Lagrangian relaxation (LR) has found universal acceptance among utilities, offering fast and accurate answers. This paper aims to retrofit the solution procedure of the LR algorithm by means of the meta-heuristic cuckoo search algorithm (CSA). When the CSA is integrated into the LR mechanism, a tighter duality gap is obtained, representing a more accurate feasible solution. The CSA exhibits a head start over classical methods such as gradient search (GS) and Newton-Raphson (NR) in closing the relative duality gap within the LR procedure. Further, electric vehicles (EVs), with their associated hard constraints, are incorporated into the model to test the proposed CSA-LR against nonlinear fluctuations of the duality gap. Simulation results show that the proposed CSA-LR model improves solution quality, both with and without EVs, compared with the conventional NR-LR method.
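The relative duality gap that serves as the quality measure above is simple to state; a minimal sketch with hypothetical costs (the LR iteration and subproblem solves themselves are not shown):

```python
def relative_duality_gap(primal_cost, dual_bound):
    """Relative gap between the best feasible (primal) commitment cost
    and the Lagrangian dual lower bound; LR-style methods iterate until
    this gap falls below a tolerance, and a tighter gap means a more
    accurate feasible solution."""
    return (primal_cost - dual_bound) / primal_cost

# Hypothetical costs from one LR iteration (currency units):
gap = relative_duality_gap(primal_cost=1_050_000.0, dual_bound=1_029_000.0)
print(f"{gap:.2%}")  # 2.00%
```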
11

Sirotov, A. V., A. S. Lapin, A. Yu Tesovskiy, F. A. Karchin, and M. S. Usachev. "Supervisory controlled forest machines." FORESTRY BULLETIN 25, no. 4 (2021): 121–28. http://dx.doi.org/10.18698/2542-1468-2021-4-121-128.

Abstract:
The article presents structural diagrams of an automated control system for the technological processes and information flows of logging operations, along with the subject and research methodology. The study confirms the increasing use of specialized forest machinery, such as forest harvesters, for timber-logging operations. One of the main advantages of forest harvesters is their automated functions, which allow the control system to optimize the cutting of the tree trunk, taking into account the price of each assortment and its optimal parameters. The majority of forest harvesters in service are equipped with a continuous pull-through harvesting head. It is shown that reducing production and resource losses, preserving the natural environment, and taking adequate measures for the reproduction of wood resources become possible by raising the management of forestry processes to a qualitatively new level through modern automated control systems for technological processes and information flows (APCS IP). A software decision support system (DSS) was developed based on the results of the data process analysis. The automated decision support system supports operator activities and provides a model for deciding on the assignment and execution of timber-logging operations.
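The trunk-cutting optimization mentioned above — crosscutting a stem into assortments by price — has the structure of the classic rod-cutting problem. A simplified dynamic-programming sketch with invented prices (real bucking optimization also accounts for stem taper, quality zones, and length/diameter limits):

```python
def bucking_value(stem_length, price):
    """Classic rod-cutting DP: best revenue from crosscutting a stem
    into priced assortment lengths. Lengths absent from the price table
    are treated as zero-value waste. A simplification of what a
    harvester head's control system optimizes."""
    best = [0] * (stem_length + 1)
    for n in range(1, stem_length + 1):
        best[n] = max(price.get(cut, 0) + best[n - cut]
                      for cut in range(1, n + 1))
    return best[stem_length]

# Hypothetical prices per log length (metres -> currency units):
prices = {3: 40, 4: 60, 5: 70}
print(bucking_value(12, prices))  # 180: three 4 m logs beat 5+4+3
```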
12

Zhang, Fan, Chris Bales, and Hasan Fleyeh. "Feature Augmentation of Classifiers Using Learning Time Series Shapelets Transformation for Night Setback Classification of District Heating Substations." Advances in Civil Engineering 2021 (March 5, 2021): 1–12. http://dx.doi.org/10.1155/2021/8887328.

Abstract:
District heating systems, which distribute heat through pipelines to residential and commercial buildings, are widely used in Northern Europe, and according to the latest studies district heating holds the largest share of the heat-supply market in Sweden. The energy efficiency of district heating systems is therefore of great interest to energy stakeholders. However, it is not uncommon for district heating systems to fail to achieve the expected performance due to various faults or inappropriate operations. Night setback is one control strategy that has been shown to be unsuitable for well-insulated modern buildings in terms of both economics and energy efficiency. From the literature, shapelet algorithms not only provide interpretable results but have also proved effective in time series classification; however, they have not been explored for this problem in the energy domain. In this study, a feature augmentation approach based on learning time series shapelets and shapelet transformation is proposed, aiming to improve the performance of classifiers for night setback classification. To evaluate the effectiveness of the proposed approach, data from 10 anonymous substations in Sweden are used in the case study. The proposed method is applied to six commonly used baseline classifiers: Support Vector Classifier, Multilayer Perceptron Neural Network, Logistic Regression, K-Nearest Neighbors, Decision Trees, and Random Forest, with precision, recall, and f1 score as the performance measures. The results of out-of-sample testing show that it is possible to improve the generalization ability of classifiers by applying the proposed approach. In addition, the highest out-of-sample f1 score is achieved by the Decision Tree classifier, whose f1 score increases from 0.599 to 0.711 for identifying night setback cases and from 0.749 to 0.808 for identifying non-night-setback cases when the proposed feature augmentation approach is used.
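The shapelet-transformation features used for augmentation are distances from a series to learned subsequences; a minimal sketch with an invented supply-temperature trace (the actual method learns the shapelets from data rather than fixing them by hand):

```python
import math

def shapelet_distance(series, shapelet):
    """Minimum Euclidean distance between a shapelet and every
    equal-length sliding window of the series -- the value used as
    one augmented feature in the shapelet transformation."""
    m = len(shapelet)
    best = math.inf
    for start in range(len(series) - m + 1):
        d = sum((series[start + i] - shapelet[i]) ** 2 for i in range(m))
        best = min(best, d)
    return math.sqrt(best)

def shapelet_transform(series, shapelets):
    """Map a raw series to a vector of shapelet distances, which can be
    appended to the original features of any baseline classifier."""
    return [shapelet_distance(series, s) for s in shapelets]

# Toy supply-temperature trace with a night-setback-like dip:
trace = [60, 60, 55, 50, 50, 55, 60, 60]
dip = [55, 50, 50]  # hypothetical learned shapelet
print(shapelet_transform(trace, [dip]))  # [0.0] -- the dip occurs verbatim
```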
13

Wilson, W. Douglas, Scott Glenn, Travis Miles, Anthony Knap, and Cesar Toro. "Transformative Ocean Observing for Hurricane Forecasting, Readiness, and Response in the Caribbean Tropical Storm Corridor." Marine Technology Society Journal 55, no. 3 (2021): 90–91. http://dx.doi.org/10.4031/mtsj.55.3.43.

Abstract:
The upper ocean in the Western Tropical Atlantic tropical storm corridor, including the Caribbean Sea, is under-sampled and climatologically warming (Figure 1). Regionally varying Essential Ocean Features impacting tropical cyclone dynamics include fresh-water upper ocean layers, mesoscale eddies, high Upper Ocean Heat Content values, and inflows from the Subtropical and Equatorial Atlantic. Ongoing research indicates that hurricane intensity forecasts can be improved with expanded and sustained ocean data collection and utilization along the hurricane path. This proposed activity will build supporting physical and social infrastructure and conduct a long-term sampling program in this critical region using gliders, High Frequency Radars (HFR), and developing technologies to provide real-time information resulting in hurricane forecast improvement. Improved forecasts will support a new generation of local storm surge/precipitation/wave and coastal impact models and guidance used to directly enhance resilience. The success of this project will depend on the merger of regional-scale planning and management and the development of local-level partnerships for implementation. To be sustainable, operational, analytical, and actionable capability has to exist at the multiple proposed regional system nodes. We will promote expanded education and workforce development using existing partner capabilities, and include an Ocean Observing for SIDS/Developing Economies component. Product and information delivery systems will have local interpretive support and will incorporate local knowledge and expertise. It is our hope that by 2030 our legacy would be a successful program to, in the words of the Decade Action Framework, "sustain long-term high-quality observations of marine and coastal environments including human interactions and deliver forecast and decision-support tools."
14

ARORA, VINAY, EDDIE YIN-KWEE NG, ROHAN SINGH LEEKHA, KARUN VERMA, TAKSHI GUPTA, and KATHIRAVAN SRINIVASAN. "HEALTH OF THINGS MODEL FOR CLASSIFYING HUMAN HEART SOUND SIGNALS USING CO-OCCURRENCE MATRIX AND SPECTROGRAM." Journal of Mechanics in Medicine and Biology 20, no. 06 (2020): 2050040. http://dx.doi.org/10.1142/s0219519420500402.

Abstract:
Cardiovascular diseases have become one of the world's leading causes of death. Several decision-making systems with computer-aided support have been developed to help cardiologists detect heart disease and thereby minimize the mortality rate. This paper uses a previously unexplored sub-domain of textural features for classifying phonocardiograms (PCG) as normal or abnormal using the Grey Level Co-occurrence Matrix (GLCM). The matrix is applied to extract features from spectrograms of the PCG signals taken from the PhysioNet 2016 benchmark dataset. Random Forest, Support Vector Machine, Neural Network, and XGBoost are applied to assess the status of the human heart from the PCG signal spectrogram. The results with GLCM are compared with two other textural feature extraction methods, viz. the structural co-occurrence matrix (SCM) and local binary patterns (LBP). Experimental results show that when machine learning models classify PCG signals on the GLCM-extracted feature set, the accuracy attained is greater than that of the peer approaches. This methodology can thus go a long way toward helping medical specialists assess a patient's heart condition precisely and accurately.
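The GLCM texture features can be sketched in a few lines. The 4-level "spectrogram patch" below is invented; a real pipeline would quantize the PCG spectrogram and compute several Haralick features (contrast, homogeneity, energy, and so on) at several offsets before training the RF/SVM/XGBoost classifiers:

```python
def glcm(image, levels, dx=1, dy=0):
    """Grey Level Co-occurrence Matrix: counts how often grey level i
    lies adjacent (at offset dx, dy) to grey level j in a quantized image."""
    rows, cols = len(image), len(image[0])
    m = [[0] * levels for _ in range(levels)]
    for r in range(rows):
        for c in range(cols):
            r2, c2 = r + dy, c + dx
            if 0 <= r2 < rows and 0 <= c2 < cols:
                m[image[r][c]][image[r2][c2]] += 1
    return m

def contrast(m):
    """One classic Haralick texture feature computed from a GLCM."""
    total = sum(sum(row) for row in m) or 1
    return sum(m[i][j] * (i - j) ** 2
               for i in range(len(m)) for j in range(len(m))) / total

# Hypothetical 4-level quantized spectrogram patch of a PCG signal:
patch = [[0, 0, 1, 1],
         [0, 0, 1, 1],
         [2, 2, 3, 3],
         [2, 2, 3, 3]]
features = [contrast(glcm(patch, levels=4))]  # would feed a classifier
```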
15

Yousef, Maria, and Prof Khaled Batiha. "Heart Disease Prediction Model Using Naïve Bayes Algorithm and Machine Learning Techniques." International Journal of Engineering & Technology 10, no. 1 (2021): 46. http://dx.doi.org/10.14419/ijet.v10i1.31310.

Abstract:
These days, heart disease has become one of the major health problems affecting people's lives worldwide, and deaths due to heart disease are increasing day by day. Heart disease prediction systems therefore play an important role in prevention: they assist doctors in making the right decision to diagnose heart disease easily. Existing prediction systems suffer from the high dimensionality of the selected features, which increases prediction time and decreases prediction accuracy because of many redundant or irrelevant features. This paper therefore aims to address the dimensionality problem by proposing a new mixed model for heart disease prediction based on the Naïve Bayes method and machine learning classifiers. In this study, we propose a new heart disease prediction model (NB-SKDR) based on the Naïve Bayes algorithm (NB) and several machine learning techniques including Support Vector Machine, K-Nearest Neighbors, Decision Tree, and Random Forest. The prediction model consists of three main phases: preprocessing, feature selection, and classification. The main objectives of the proposed model are to improve the performance of the prediction system and to find the best subset of features. The approach uses the Naïve Bayes technique, based on Bayes' theorem, to select the best subset of features for the subsequent classification phase, handling the high-dimensionality problem by avoiding unnecessary features and selecting only the important ones in an attempt to improve the efficiency and accuracy of the classifiers. The method reduces the number of features from 13 to 6 (age, gender, blood pressure, fasting blood sugar, cholesterol, and exercise-induced angina) by determining the dependency between attributes. Dependent attributes are attributes whose values depend on another attribute in deciding the value of the class attribute; the dependency between attributes is measured by the conditional probability, which can easily be computed by Bayes' theorem. In the classification phase, the proposed system uses different classification algorithms (Decision Tree (DT), Random Forest (RF), Support Vector Machine (SVM), and K-Nearest Neighbors (KNN)) as classifiers for predicting whether a patient has heart disease. The model is trained and evaluated using the Cleveland Heart Disease database, which contains 13 features and 303 samples. Different algorithms use different rules to produce different representations of knowledge, so the algorithms used to build the model were selected on the basis of their performance: we applied and compared DT, SVM, RF, and KNN to identify the algorithm best suited to achieving high accuracy in heart disease prediction. After combining the Naïve Bayes method with each of these classifiers, the performance of the combined algorithms was evaluated by different performance metrics (specificity, sensitivity, and accuracy). The experimental results show that, out of these four classification models, the combination of the Naïve Bayes feature selection approach and the SVM-RBF classifier predicts heart disease with the highest accuracy, 98%. Finally, the proposed approach is compared with two other systems based on different feature selection techniques: the first uses the Genetic Algorithm (GA) and the second uses Principal Component Analysis (PCA). The comparison shows that the Naïve Bayes selection approach of the proposed system outperforms the GA and PCA approaches in terms of prediction accuracy.
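The dependency measure described above — a conditional probability obtained by counting, per Bayes' theorem — can be sketched as follows (the chest-pain attribute and the record counts are invented for illustration):

```python
from collections import Counter

def conditional_probabilities(pairs):
    """P(b | a) for every observed (a, b) pair, estimated by counting --
    the Bayes-theorem quantity used here to score attribute dependency."""
    joint = Counter(pairs)
    marginal = Counter(a for a, _ in pairs)
    return {(a, b): joint[(a, b)] / marginal[a] for (a, b) in joint}

# Hypothetical records: (chest-pain type, heart-disease label)
records = [("typical", 1), ("typical", 1), ("typical", 0),
           ("atypical", 0), ("atypical", 0), ("atypical", 1)]
probs = conditional_probabilities(records)
print(round(probs[("typical", 1)], 3))  # 0.667
```

Attributes whose conditional probabilities deviate strongly from the marginal class distribution are the informative ones worth keeping.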
16

KÜGLER, PHILIPP. "Online parameter identification in time-dependent differential equations as a non-linear inverse problem." European Journal of Applied Mathematics 19, no. 5 (2008): 479–506. http://dx.doi.org/10.1017/s0956792508007547.

Abstract:
Online parameter identification in time-dependent differential equations from time course observations related to the physical state can be understood as a non-linear inverse and ill-posed problem and appears in a variety of applications in science and engineering. The feature as well as the challenge of online identification is that sensor data have to be continuously processed during the operation of the real dynamic process in order to support simulation-based decision making. In this paper we present an online parameter identification method that is based on a non-linear parameter-to-output operator and, as opposed to methods available so far, works both for finite- and infinite-dimensional dynamical systems, e.g., both for ordinary differential equations and time-dependent partial differential equations. A further advantage of the method suggested is that it renders typical restrictive assumptions such as full state observability, linear parametrisation of the underlying model and data differentiation or filtering unnecessary. Assuming existence of a solution for exact data, a convergence analysis based on Lyapunov theory is presented. Numerical illustrations given are by means of online identification both of aerodynamic coefficients in a 3DoF-longitudinal aircraft model and of a (distributed) conductivity coefficient in a heat equation.
17

Ramana, Lovedeep, Wooram Choi, and Young-Jin Cha. "Fully automated vision-based loosened bolt detection using the Viola–Jones algorithm." Structural Health Monitoring 18, no. 2 (2018): 422–34. http://dx.doi.org/10.1177/1475921718757459.

Abstract:
Many damage detection methods that use data obtained from contact sensors physically attached to structures have been developed. However, damage-sensitive features such as the modal properties of steel and reinforced concrete are sensitive to environmental conditions such as temperature and humidity. These uncertainties are difficult to address with a regression model or any other temperature compensation method, and these uncertainties are the primary causes of false alarms. A vision-based remote sensing system can be an option for addressing some of the challenges inherent in traditional sensing systems because it provides information about structural conditions. Using bolted connections is a common engineering practice, but very few vision-based techniques have been developed for loosened bolt detection. Thus, this article proposes a fully automated vision-based method for detecting loosened civil structural bolts using the Viola–Jones algorithm and support vector machines. Images of bolt connections for training were taken with a smartphone camera. The Viola–Jones algorithm was trained on two datasets of images with and without bolts to localize all the bolts in the images. The localized bolts were automatically cropped and binarized to calculate the bolt head dimensions and the exposed shank length. The calculated features were fed into a support vector machine to generate a decision boundary separating loosened and tight bolts. We tested our method on images taken with a digital single-lens reflex camera.
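Once the bolt-head dimensions and exposed shank length are measured from the image, the support vector machine reduces at prediction time to a signed linear score against the learned decision boundary. A minimal sketch with invented, untrained weights (the paper's actual features and kernel may differ):

```python
def classify_bolt(features, weights, bias):
    """Linear SVM decision function sign(w . x + b): +1 for tight,
    -1 for loosened. Weights and bias here are illustrative, not trained."""
    score = sum(w * x for w, x in zip(weights, features)) + bias
    return 1 if score >= 0 else -1

# Features: [bolt-head height ratio, exposed shank length in pixels]
weights, bias = [2.0, -0.1], 1.0  # hypothetical learned parameters
print(classify_bolt([1.0, 5.0], weights, bias))   # 1  (tight)
print(classify_bolt([0.4, 40.0], weights, bias))  # -1 (loosened)
```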
18

Zhuravska, Nataliia, and Valerii Likhatskyi. "Systematization and Formalization of Passive Monitoring Data in Accordance with the Component-Functional State of Heat Supply Systems." ЕКОНОМІКА І РЕГІОН Науковий вісник, no. 4(79) (December 30, 2020): 62–68. http://dx.doi.org/10.26906/eir.2020.4(79).2188.

Abstract:
The article examines the assessment of the state of hot-water and steam heat supply systems at heat and power facilities with a view to the innovative and effective use of thermal energy in all parts of these systems: generation (heating water or generating steam), transport to the consumer, and consumption, including in the housing and communal complex and the construction industry. A scientific and methodological concept has been created for implementing passive monitoring to assess the state of heat supply systems under the action of electromagnetic fields on the material flows within them. It is established that an effective form of reagent-free water treatment is conditioning in electromagnetic fields: the behaviour of the material flows is determined by the electromagnetic dissociation of their microparticles and the formation of active complexes through the interaction between them. It is established that the systematization and formalization of passive monitoring data should take three aspects into account: engineering-technological, engineering-microbiological, and organizational-managerial decisions. It is shown that the passive monitoring system is an obligatory subordinate component of environmental management in the industrial sphere, namely at heat and power facilities with water and steam heat supply systems of an innovative direction. The views of Ukrainian and foreign scientists and monographs are analysed and joint conclusions are drawn under the following headings: the main economic, environmental, and regulatory aspects of nature management; the formation of the biosphere under transformation of its individual components; and methods of environmental management and environmental policy.
Under these conditions, forming an effective economic mechanism for environmental management is a priority. On the central issue of integral management, clarification is required: those formalized parameters should be considered effective that make it possible to reach a compromise between economic development and environmental safety. It is the economic and environmental analysis of the components of technological processes that makes it possible to establish the limiting factors of development in the "action - state - improvement" system and to identify individual tendencies in order to overcome them. The theoretical principles of magnetized water are presented in the authors' works, and their priority is confirmed by three patents of Ukraine for utility models, one copyright certificate for intellectual property, and support received at domestic and foreign conferences. The proposed levers improve integral control in modernizing the technology of reagent-free water treatment (the use of electromagnetic fields) at heat and power facilities, prevent environmental pollution, and save energy costs through effective continuous planning of new facilities. Therefore, through analysis of the general environmental situation and of the technical state of the systems under study, the developed classification and formalization of passive monitoring data, together with the application of the theoretical foundations in practice (regulations for industrial implementation), form the basis of the concept of sustainable development.
APA, Harvard, Vancouver, ISO, and other styles
19

Prasetyo, Wibisono Adhi, and Gatot Yudoko. "BUSINESS PROCESS RE-ENGINEERING THROUGH 3 HARD Ss McKINSEY FRAMEWORK AS WORKING CAPITAL MANAGEMENT IMPROVEMENT PROPOSAL (CASE STUDY: PT. XYZ – 2020/2021)." Jurnal Pertahanan: Media Informasi ttg Kajian & Strategi Pertahanan yang Mengedepankan Identity, Nasionalism & Integrity 7, no. 1 (2021): 1. http://dx.doi.org/10.33172/jp.v7i1.1144.

Full text
Abstract:
<div><p class="Els-history-head">Due to negative operating cash flow, the unsatisfactory performance of working capital indicates problems in the company's internal business processes. Meanwhile, the COVID-19 pandemic urges management to make wiser decisions, primarily to support better working capital management: the company cannot rely forever on its aggressive capital structure. Accordingly, management committed to an operational improvement effort related to working capital management. This decision is the background of this research, which identifies and analyzes the root-cause problems and improvement-plan issues. The business issue is investigated further, and business process re-engineering through the 3 hard Ss of the McKinsey framework is selected as the research design. These frameworks are chosen for their capability to map the entire process and link its parts to one another. Improvement ideas for ineffective and inefficient activities can be combined and organized so that they align with the organization's direction. The research is conducted through semi-structured interviews to explore the company's condition and uses secondary data to gather information and strengthen the evidence. At least 14 ineffective and inefficient activities were identified, and a root-cause analysis was then carried out to determine the sources of the problems to be eliminated. Based on these findings, four management improvements are suggested: production management, receivable management, payable management, and cash flow control and monitoring, translated into action-plan strategies, procedures, and tools together with supporting structures and systems.</p></div>
APA, Harvard, Vancouver, ISO, and other styles
20

Maharana, Chapala, Bijan Bihari Mishra, and Ch Sanjeev Kumar Dash. "A Topical Survey: Applications of Machine Learning in Medical Issues." Journal of Computational and Theoretical Nanoscience 17, no. 11 (2020): 5010–19. http://dx.doi.org/10.1166/jctn.2020.9334.

Full text
Abstract:
Computational intelligence methods have been applied to a wide range of real-world problems with high accuracy within acceptable time limits. Machine learning (ML) approaches such as classification, feature selection, and feature extraction have solved problems across many domains. They use ML models implemented with a suitable tool, or a combination of tools, from neural networks (NN), support vector machines (SVM), deep learning (DL), and extreme learning machines (ELM). A model is trained on known data, and ML algorithms (fuzzy logic, genetic algorithms) are used to optimize accuracy for medical problems such as gene expression analysis and image segmentation for information extraction, disease diagnosis, health monitoring, and disease treatment. Most of these medical problems are addressed using recent advances in artificial intelligence (AI), including the development of biomedical systems (e.g., knowledge-based decision support systems) and the combination of AI technologies with medical informatics. AI methods implemented as machine learning models are increasingly found in real-life applications, e.g., healthcare and natural-disaster detection and forecasting. Expert systems, maintained by experts, capture knowledge for use in decision-making applications. ML models are found in medical applications such as disease diagnosis (e.g., cancer prediction and diabetes prediction) and disease treatment (e.g., in diabetes, the reduction in mean glucose concentration following intermittent gastric feeds). Feature-selection methods are used in EEG classification to detect disease severity in heart-related diseases and to identify genes in disorders such as autism spectrum disorder. ML models are also found in health record systems, and there are further applications in image segmentation, tissue extraction, and image fragmentation for disease diagnosis (e.g., lesion detection in breast cancer for malignancy) and the subsequent treatment of those diseases. ML models also appear in mobile health treatment and in the treatment of psychiatric and non-verbal patients. Medical data handling, a vital part of healthcare systems and of AI system development, can likewise be addressed by machine learning. ML approaches to medical problems often use ensemble methods, or combinations of ML tools and algorithms, to optimize results with good accuracy at a faster rate.
APA, Harvard, Vancouver, ISO, and other styles
21

Porto, Roberto, José M. Molina, Antonio Berlanga, and Miguel A. Patricio. "Minimum Relevant Features to Obtain Explainable Systems for Predicting Cardiovascular Disease Using the Statlog Data Set." Applied Sciences 11, no. 3 (2021): 1285. http://dx.doi.org/10.3390/app11031285.

Full text
Abstract:
Learning systems have been focused on creating models capable of obtaining the best results in error metrics. Recently, the focus has shifted to improving the interpretation and explanation of the results. The need for interpretation is greater when these models are used to support decision making; in some areas, such as medicine, it becomes an indispensable requirement. The goal of this study was to define a simple process to construct a system that could be easily interpreted, based on two principles: (1) reduction of attributes without degrading the performance of the prediction systems and (2) selection of a technique to interpret the final prediction system. To describe this process, we selected a problem, predicting cardiovascular disease, by analyzing the well-known Statlog (Heart) data set from the University of California Irvine Machine Learning Repository. We analyzed the cost of making predictions easier to interpret by reducing the number of features that explain the classification of health status versus the cost in accuracy. We performed an analysis on a large set of classification techniques and performance metrics, demonstrating that it is possible to construct explainable and reliable models that provide high-quality predictive performance.
APA, Harvard, Vancouver, ISO, and other styles
22

Soltani, Ali, and Ehsan Sharifi. "Understanding and Analysing the Urban Heat Island (UHI) Effect in Micro-Scale." International Journal of Social Ecology and Sustainable Development 10, no. 2 (2019): 14–28. http://dx.doi.org/10.4018/ijsesd.2019040102.

Full text
Abstract:
The shortage of vegetation cover alongside urban structures and land hardscape in cities causes an artificial temperature increase in urban environments known as the urban heat island (UHI) effect. The artificial heat stress in cities poses a particular threat to the usability and health-safety of outdoor living in public space. Australia may face a likely 3.8°C increase in surface temperature by 2090. Such an increase in temperature will have a severe impact on regional and local climate systems, natural ecosystems, and human life in cities. This paper aims to determine the patterns of the UHI effect at the micro-scale of the Adelaide metropolitan area, South Australia. The urban near-surface temperature profile of Adelaide was measured along a linear east-west cross-section of the metropolitan area via the mobile traverse method between 26 July 2013 and 15 August 2013. Results indicate that while the maximum UHI effect occurs at midnight in the central business district (CBD) of Adelaide, the afternoon urban warmth shows more point-to-point temperature variation, especially during the late afternoon when local air temperature is normally at its peak. Thus, critical measurement of the heat-health consequences of the UHI effect needs to focus on afternoon heat stress conditions in UHIs rather than the commonly known night-time phenomenon. This mobile traverse urban heat study of Adelaide supports the hypothesis that the UHI effect varies in the built environment during daily cycles and within short distances. Classical UHI measurements are commonly performed during the night, when urban-rural temperature differences are at their maximum; thus, they fall short in addressing the issue of excess heat stress on human participants. However, having thermally comfortable urban microclimates is a fundamental characteristic of healthy and vibrant public spaces.
Therefore, urban planning professionals and decision makers are required to consider diurnal heat stress alongside nocturnal urban heat islands in planning healthy cities. The results of this article show that diurnal heat stress varies in the built environment during daily cycles and within short distances. This study confirms that maximum urban heat stress occurs during the late afternoon, when both overall temperature and daily urban warmth are at their peak. The literature indicates that diurnal heat stress peaks in hard-landscaped urban settings, while it may decrease in urban parklands and near water bodies. Therefore, urban greenery and surface water can assist in achieving more liveable and healthy urban environments (generalisation requires further research). A better understanding of daily urban warmth variations in cities assists urban policy making and public life management in the context of climate change.
APA, Harvard, Vancouver, ISO, and other styles
23

Narayanaswamy, Vedachalam, Ramesh Raju, Muthukumaran Durairaj, et al. "Reliability-Centered Development of Deep Water ROV ROSUB 6000." Marine Technology Society Journal 47, no. 3 (2013): 55–71. http://dx.doi.org/10.4031/mtsj.47.3.3.

Full text
Abstract:
This paper presents the reliability-centered development of the deep water Remotely Operable Vehicle (ROV) ROSUB 6000 by the National Institute of Ocean Technology (NIOT), India. ROV operations are required during deep water interventions, such as well head operations, emergency response situations, bathymetric surveys, gas hydrate surveys, poly-metallic nodule exploration, and salvage operations. As per the requirements, the system needs to be capable of deep water operation for 300 h/year and to be extremely reliable. Methodologies applied during the development and enhancement phases of the electrical and control systems, taking into consideration the cost, space, and time constraints to attain the best possible reliability, are detailed. Reliability, availability, and maintainability (RAM) studies are carried out to identify possible failure cases. It is found that ROV-Tether Management System (TMS) docking failure could be detrimental to the ROV system, and manipulator system failure could be detrimental to subsea operations. The improved design is calculated to have a mean time between failures (MTBF) of 4.9 and 6.2 years for ROV-TMS docking and manipulator system operations, respectively. The importance of monitoring tether cable health during normal and winding operations is discussed, along with the systems implemented for effectively monitoring and maintaining the tether cable's operational and functional health using the tether cable pay-out, vehicle heading, electric insulation, and optical performance sensors with the aid of a sea battery. Maintenance decision support tables, which tell operational personnel how to keep the systems in order during the indicated intervals so that the highest possible reliability is maintained during the mission period, are also presented.
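The MTBF figures quoted in the abstract follow from standard series-system reliability arithmetic: constant component failure rates add, and the system MTBF is the reciprocal of the total rate. A minimal sketch of that calculation; the component failure rates below are illustrative assumptions of ours, not values from the paper:

```python
def series_mtbf(failure_rates_per_year):
    """MTBF (years) of a series system: constant failure rates add,
    and MTBF is the reciprocal of the combined rate."""
    total_rate = sum(failure_rates_per_year)
    if total_rate <= 0:
        raise ValueError("need at least one positive failure rate")
    return 1.0 / total_rate

def availability(mtbf_years, mttr_years):
    """Steady-state availability of a repairable item = MTBF / (MTBF + MTTR)."""
    return mtbf_years / (mtbf_years + mttr_years)

# Hypothetical subsystems in a docking chain (failures per year).
rates = [0.05, 0.10, 0.05]
mtbf = series_mtbf(rates)   # 1 / 0.20 = 5.0 years
```

For a repairable system such as an ROV, the availability figure combines this MTBF with the mean time to repair, which is why RAM studies track both quantities.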
APA, Harvard, Vancouver, ISO, and other styles
24

Carpenter, Chris. "Dynamic Simulation in Deep Water Enhances Operations From Design to Production." Journal of Petroleum Technology 73, no. 05 (2021): 47–48. http://dx.doi.org/10.2118/0521-0047-jpt.

Full text
Abstract:
This article, written by JPT Technology Editor Chris Carpenter, contains highlights of paper OTC 30838, “Shell Appomattox Model-Based Operations From Design to Production: A Game Changer in Gulf of Mexico Deepwater Operation,” by Robert Tulalian, Shell, and Evan Keever and Ankur Rastogi, Kongsberg, prepared for the 2020 Offshore Technology Conference, originally scheduled to be held in Houston, 4–7 May. The paper has not been peer reviewed. Copyright 2020 Offshore Technology Conference. Reproduced by permission. The complete paper discusses how large operations such as Appomattox in the Gulf of Mexico’s deepwater Norphlet formation can use an integrated dynamic-simulation-based solution throughout the project life cycle to aid in design verification, operator training, startup support, and real-time surveillance. The authors write that their recommendations and findings can be applied to similar project implementation efforts elsewhere in the industry. Introduction: The Appomattox development spans Mississippi Canyon Blocks 348, 391, 392, and 393. Peak production rates are estimated to be approximately 175,000 BOE/D, with water injection planned for the future to support reservoir pressures. Appomattox includes a combined-cycle steam system, using process waste heat to generate steam. This steam can be used to drive a generator, providing extra power for the facility. The Appomattox facility can be seen in Fig. 1. A multipurpose dynamic simulator (MPDS) was developed to address the inherent complexities of the Appomattox system, providing a high-fidelity integrated model that simulates both topsides and subsea process conditions. This model was integrated with the Appomattox control system and deployed in a setup that mimics the offshore control room, creating a realistic training environment for operators.
The MPDS was completed over 1 year before first oil, providing ample time for operator training and other use cases such as distributed-control-system (DCS) checkout and engineering studies. Because of the success of the MPDS, the operator applied the existing Appomattox model to the operation phase through the creation of a real-time surveillance system (RTS). Connecting the process model to the facility’s historian by open-platform communications (OPC) enables the RTS to serve as a virtual copy of the live facility, mimicking process conditions in real time. The RTS thus serves as a platform for useful surveillance applications such as virtual flow metering, blockage detection, and equipment-performance monitoring. Process Model Development: Once the decision to build an MPDS was made, the project team determined which systems would be included in the scope of the model as well as what data would be used for input and validation. Because the MPDS would be used for both engineering and operations, most systems were included in the scope and modeled at high fidelity to maximize potential benefits.
APA, Harvard, Vancouver, ISO, and other styles
25

Reddy, Ravi, Navid Resalat, Leah M. Wilson, Jessica R. Castle, Joseph El Youssef, and Peter G. Jacobs. "Prediction of Hypoglycemia During Aerobic Exercise in Adults With Type 1 Diabetes." Journal of Diabetes Science and Technology 13, no. 5 (2019): 919–27. http://dx.doi.org/10.1177/1932296818823792.

Full text
Abstract:
Background: Fear of exercise-related hypoglycemia is a major reason why people with type 1 diabetes (T1D) do not exercise. There is no validated prediction algorithm that can predict hypoglycemia at the start of aerobic exercise. Methods: We developed and evaluated two separate algorithms to predict hypoglycemia at the start of exercise. Model 1 is a decision tree and model 2 is a random forest model. Both models were trained on a meta-data set based on 154 observations of in-clinic aerobic exercise in 43 adults with T1D from 3 different studies that included participants using sensor-augmented pump therapy, automated insulin delivery therapy, and automated insulin and glucagon therapy. Both models were validated on an entirely new validation data set with 90 exercise observations collected from 12 new adults with T1D. Results: Model 1 identified two critical features predictive of hypoglycemia during exercise: heart rate and glucose at the start of exercise. If heart rate was greater than 121 bpm during the first 5 min of exercise and glucose at the start of exercise was less than 182 mg/dL, it predicted hypoglycemia with 79.55% accuracy. Model 2 achieved a higher accuracy of 86.7% using additional features and higher complexity. Conclusions: The models presented here can help people with T1D avoid exercise-related hypoglycemia. The simple model 1 heuristic can be easily remembered (the 180/120 rule), while model 2 is more complex and requires computational resources, making it suitable for automated artificial pancreas or decision support systems.
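Model 1's published decision rule is simple enough to write down directly. A sketch of that rule (our transcription for illustration, not the authors' code), using the abstract's exact thresholds of 121 bpm and 182 mg/dL:

```python
def predicts_hypoglycemia(mean_hr_first_5min_bpm, start_glucose_mg_dl):
    """The '180/120 rule' decision tree from the abstract: flag risk of
    exercise-related hypoglycemia when early-exercise heart rate is high
    AND starting glucose is low (reported accuracy: 79.55%)."""
    return mean_hr_first_5min_bpm > 121 and start_glucose_mg_dl < 182

at_risk = predicts_hypoglycemia(130, 150)   # high HR, low glucose -> True
```

The mnemonic name "180/120" rounds the fitted thresholds to values a person can remember during exercise.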
APA, Harvard, Vancouver, ISO, and other styles
26

Hasan, Omar Shakir, and Ibrahim Ahmed Saleh. "Development of heart attack prediction model based on ensemble learning." Eastern-European Journal of Enterprise Technologies 4, no. 2(112) (2021): 26–34. http://dx.doi.org/10.15587/1729-4061.2021.238528.

Full text
Abstract:
With the advent of the data age, the continuous improvement and widespread application of medical information systems have led to exponential growth in biomedical data, such as medical imaging, electronic medical records, biometric tags, and clinical records, which have potential and essential research value. However, medical research based on classical statistical methods is limited by the size and class composition of the study population, so it cannot effectively perform data mining on large-scale medical information. Supervised machine learning techniques can effectively solve this problem. Heart attack is one of the most common conditions and one of the leading causes of death, so a system that can accurately and reliably support early diagnosis is an essential and influential step in treating such diseases. Researchers have used various data mining and machine learning techniques to analyze medical data, helping professionals predict heart disease. This paper examines various features related to heart disease and builds a model based on ensemble learning. The proposed system involves preprocessing the data, selecting attributes, and then using a logistic regression algorithm as the meta-classifier to build the ensemble learning model. Machine learning algorithms (support vector machines, decision tree, random forest, extreme gradient boosting) are also used for prediction on the Framingham Heart Study dataset and compared with the proposed methodology. The results demonstrate the feasibility and effectiveness of the proposed ensemble-learning prediction method, which provides better accuracy for medical recommendations than the single traditional machine learning algorithms.
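The stacking scheme described, base learners whose outputs feed a logistic-regression meta-classifier, can be sketched without any ML library. This toy version is our own illustration with hand-rolled batch gradient descent and stand-in base models, not the paper's implementation:

```python
import math

def train_logistic_meta(meta_features, labels, lr=1.0, epochs=2000):
    """Fit a logistic-regression meta-classifier on base-model outputs
    using plain batch gradient descent on the logistic loss."""
    n = len(meta_features)
    d = len(meta_features[0])
    weights, bias = [0.0] * d, 0.0
    for _ in range(epochs):
        grad_w, grad_b = [0.0] * d, 0.0
        for x, y in zip(meta_features, labels):
            z = sum(w * xi for w, xi in zip(weights, x)) + bias
            err = 1.0 / (1.0 + math.exp(-z)) - y   # sigmoid(z) - label
            for i, xi in enumerate(x):
                grad_w[i] += err * xi
            grad_b += err
        weights = [w - lr * g / n for w, g in zip(weights, grad_w)]
        bias -= lr * grad_b / n
    return weights, bias

def stack_predict(base_models, weights, bias, sample):
    """Feed every base model's 0/1 vote into the logistic meta-classifier."""
    x = [m(sample) for m in base_models]
    z = sum(w * xi for w, xi in zip(weights, x)) + bias
    return 1 if z > 0 else 0

# Toy demo: one accurate base model, one useless one; the meta-classifier
# learns to trust the accurate model.
def good(s):   # stand-in for, e.g., a tuned SVM
    return 1 if s > 0.5 else 0

def bad(s):    # stand-in for a degenerate always-positive model
    return 1

samples = [0.1, 0.2, 0.3, 0.4, 0.6, 0.7, 0.8, 0.9]
labels = [0, 0, 0, 0, 1, 1, 1, 1]
meta_x = [[good(s), bad(s)] for s in samples]
w, b = train_logistic_meta(meta_x, labels)
preds = [stack_predict([good, bad], w, b, s) for s in samples]
```

In practice the base models would be trained SVM, decision-tree, random-forest, and gradient-boosting classifiers, and the meta-classifier would be fit on out-of-fold predictions to avoid leakage.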
APA, Harvard, Vancouver, ISO, and other styles
27

Jacob, Varghese S., and Hasan Pirkul. "Organizational decision support systems." International Journal of Man-Machine Studies 36, no. 6 (1992): 817–32. http://dx.doi.org/10.1016/0020-7373(92)90074-u.

Full text
APA, Harvard, Vancouver, ISO, and other styles
28

Tuan Le, Minh, Minh Thanh Vo, Nhat Tan Pham, and Son V.T Dao. "Predicting heart failure using a wrapper-based feature selection." Indonesian Journal of Electrical Engineering and Computer Science 21, no. 3 (2021): 1530. http://dx.doi.org/10.11591/ijeecs.v21.i3.pp1530-1539.

Full text
Abstract:
In the current health system, it is very difficult for medical practitioners/physicians to diagnose the effectiveness of heart contraction. In this research, we propose a machine learning model to predict heart contraction using an artificial neural network (ANN). We also propose a novel wrapper-based feature selection utilizing grey wolf optimization (GWO) to reduce the number of required input attributes. We compare the results achieved using our method with several conventional machine learning approaches such as support vector machine, decision tree, K-nearest neighbor, naïve Bayes, random forest, and logistic regression. Computational results show not only that much fewer features are needed, but also that higher prediction accuracy, around 87%, can be achieved. This work has the potential to be applied in clinical practice and to become a supporting tool for doctors/physicians.
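The wrapper idea, scoring each candidate feature subset by the accuracy of the classifier trained on it, is independent of the search strategy. GWO itself is a population-based metaheuristic; as a simplified stand-in, this sketch (our own illustration, not the paper's method) uses greedy forward selection with a leave-one-out 1-nearest-neighbour evaluator:

```python
def loo_1nn_accuracy(rows, labels, features):
    """Leave-one-out accuracy of a 1-nearest-neighbour classifier using
    only the given feature indices -- the 'wrapper' evaluation step."""
    correct = 0
    for i, row in enumerate(rows):
        best_j, best_d = None, None
        for j, other in enumerate(rows):
            if j == i:
                continue
            d = sum((row[f] - other[f]) ** 2 for f in features)
            if best_d is None or d < best_d:
                best_j, best_d = j, d
        correct += labels[best_j] == labels[i]
    return correct / len(rows)

def forward_select(rows, labels):
    """Greedy forward selection: keep adding the feature that most improves
    the wrapped classifier; stop when nothing improves."""
    remaining = list(range(len(rows[0])))
    selected, best_score = [], 0.0
    while remaining:
        score, f = max((loo_1nn_accuracy(rows, labels, selected + [f]), f)
                       for f in remaining)
        if score <= best_score:
            break
        selected.append(f)
        remaining.remove(f)
        best_score = score
    return selected

# Toy data: feature 0 separates the classes, feature 1 is uninformative.
rows = [(0.0, 0.0), (1.0, 0.0), (2.0, 0.0), (3.0, 0.0),
        (10.0, 0.0), (11.0, 0.0), (12.0, 0.0), (13.0, 0.0)]
labels = [0, 0, 0, 0, 1, 1, 1, 1]
chosen = forward_select(rows, labels)   # -> [0]
```

A metaheuristic such as GWO explores subsets more globally than this greedy loop, but it plugs into the same kind of subset-scoring objective.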
APA, Harvard, Vancouver, ISO, and other styles
29

Hersh, M. A. "Sustainable decision making and decision support systems." Computing & Control Engineering Journal 9, no. 6 (1998): 289–95. http://dx.doi.org/10.1049/cce:19980610.

Full text
APA, Harvard, Vancouver, ISO, and other styles
30

Remy, C., B. Lesjean, and J. Waschnewski. "Identifying energy and carbon footprint optimization potentials of a sludge treatment line with Life Cycle Assessment." Water Science and Technology 67, no. 1 (2013): 63–73. http://dx.doi.org/10.2166/wst.2012.529.

Full text
Abstract:
This study exemplifies the use of Life Cycle Assessment (LCA) as a tool to quantify the environmental impacts of processes for wastewater treatment. In a case study, the sludge treatment line of a large wastewater treatment plant (WWTP) is analysed in terms of cumulative energy demand and the emission of greenhouse gases (carbon footprint). Sludge treatment consists of anaerobic digestion, dewatering, drying, and disposal of stabilized sludge in mono-incineration or co-incineration in power plants or cement kilns. All relevant forms of energy demand (electricity, heat, chemicals, fossil fuels, transport) and greenhouse gas emissions (fossil CO2, CH4, N2O) are accounted for in the assessment, including the treatment of return liquor from dewatering in the WWTP. Results show that the existing process has a favourable energy balance (–162 MJ/PECOD·a) and carbon footprint (–11.6 kg CO2-eq/PECOD·a), i.e., it is a net supplier, through secondary products such as electricity from biogas production or mono-incineration and the substitution of fossil fuels in co-incineration. However, disposal routes for stabilized sludge differ considerably in their energy and greenhouse gas profiles. In total, LCA proves to be a suitable tool to support future investment decisions with environmentally relevant information on the impact of wastewater treatment, and of urban water systems in general.
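The sign convention behind the quoted balances (negative values mean the treatment line supplies more energy, or avoids more emissions, than it consumes) is plain bookkeeping: demands count positive and avoided burdens count negative. A toy sketch with made-up inventory numbers, not the study's data:

```python
def net_balance(demands, credits):
    """Net footprint: total consumption minus avoided burdens.
    A negative result means the process is a net supplier."""
    return sum(demands) - sum(credits)

# Illustrative per-capita figures only (MJ/PE*a), not from the study.
demands = [40.0, 25.0, 60.0]   # e.g. electricity, heat, transport fuel
credits = [180.0, 107.0]       # e.g. biogas electricity, substituted coal
energy = net_balance(demands, credits)   # -> -162.0, a net energy credit
```

The same accounting applies per greenhouse gas, with each flow converted to CO2 equivalents before summing.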
APA, Harvard, Vancouver, ISO, and other styles
31

Cassie, Claire. "Marketing decision support systems." Industrial Management & Data Systems 97, no. 8 (1997): 293–96. http://dx.doi.org/10.1108/02635579710195000.

Full text
APA, Harvard, Vancouver, ISO, and other styles
32

Ba, Sulin, and Andrew B. Whinston. "Challenges for Decision Support Systems." IEEJ Transactions on Electronics, Information and Systems 114, no. 3 (1994): 295–309. http://dx.doi.org/10.1541/ieejeiss1987.114.3_295.

Full text
APA, Harvard, Vancouver, ISO, and other styles
33

Meystel, A. M. "Multiresolutional hierarchical decision support systems." IEEE Transactions on Systems, Man and Cybernetics, Part C (Applications and Reviews) 33, no. 1 (2003): 86–101. http://dx.doi.org/10.1109/tsmcc.2003.809866.

Full text
APA, Harvard, Vancouver, ISO, and other styles
34

Alwateer, Majed, Abdulqader M. Almars, Kareem N. Areed, Mostafa A. Elhosseini, Amira Y. Haikal, and Mahmoud Badawy. "Ambient Healthcare Approach with Hybrid Whale Optimization Algorithm and Naïve Bayes Classifier." Sensors 21, no. 13 (2021): 4579. http://dx.doi.org/10.3390/s21134579.

Full text
Abstract:
There is a crucial need to process patients’ data immediately to make sound decisions rapidly; this data is very large and has excessive features. Recently, many cloud-based IoT healthcare systems have been proposed in the literature. However, there are still several challenges associated with processing time and overall system efficiency for big healthcare data. This paper introduces a novel approach for processing healthcare data and predicting useful information at minimum computational cost. The main objective is to accept several types of data, improve accuracy, and reduce processing time. The proposed approach uses a hybrid algorithm consisting of two phases. The first phase minimizes the number of features of big data by using the Whale Optimization Algorithm as a feature selection technique. The second phase then performs real-time data classification using a Naïve Bayes classifier. The approach is based on fog computing for better business agility, better security, deeper insights with privacy, and reduced operating cost. The experimental results demonstrate that the proposed approach can reduce the number of dataset features, improve accuracy, and reduce processing time. Accuracy is enhanced by an average of 3.6% (3.34% for diabetes, 2.94% for heart disease, 3.77% for heart attack prediction, and 4.15% for sonar). The approach also enhances processing speed, reducing processing time by an average of 8.7% (28.96% for diabetes, 1.07% for heart disease, 3.31% for heart attack prediction, and 1.4% for sonar).
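The classification phase, Naïve Bayes on the reduced feature set, is compact enough to sketch from scratch. This is a generic Gaussian Naïve Bayes (our own illustration, not the authors' fog-deployed implementation):

```python
import math

def fit_gnb(rows, labels):
    """Per-class mean and variance for each feature, plus class priors --
    everything a Gaussian naive Bayes model needs."""
    model = {}
    for c in set(labels):
        group = [r for r, y in zip(rows, labels) if y == c]
        stats = []
        for f in range(len(rows[0])):
            vals = [r[f] for r in group]
            mean = sum(vals) / len(vals)
            var = sum((v - mean) ** 2 for v in vals) / len(vals) + 1e-9
            stats.append((mean, var))
        model[c] = (len(group) / len(rows), stats)
    return model

def predict_gnb(model, row):
    """Pick the class with the highest log-posterior under the
    feature-independence (naive) assumption."""
    best_c, best_lp = None, None
    for c, (prior, stats) in model.items():
        lp = math.log(prior)
        for x, (mean, var) in zip(row, stats):
            lp += -0.5 * math.log(2 * math.pi * var) - (x - mean) ** 2 / (2 * var)
        if best_lp is None or lp > best_lp:
            best_c, best_lp = c, lp
    return best_c

# Toy two-feature data: class 0 clusters near (1, 1), class 1 near (5, 5).
rows = [(1.0, 1.2), (0.8, 1.0), (1.1, 0.9), (5.0, 5.2), (4.8, 5.0), (5.1, 4.9)]
labels = [0, 0, 0, 1, 1, 1]
model = fit_gnb(rows, labels)
```

Because training is just counting and averaging, the classifier stays cheap enough for the resource-constrained fog nodes the paper targets.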
APA, Harvard, Vancouver, ISO, and other styles
35

Arias, Ana, Patricia Mores, Nicolás Scenna, José Caballero, Sergio Mussati, and Miguel Mussati. "Optimal Design of a Two-Stage Membrane System for Hydrogen Separation in Refining Processes." Processes 6, no. 11 (2018): 208. http://dx.doi.org/10.3390/pr6110208.

Full text
Abstract:
This paper fits into the process system engineering field by addressing the optimization of a two-stage membrane system for H2 separation in refinery processes. To this end, a nonlinear mathematical programming (NLP) model is developed to simultaneously optimize the size of each membrane stage (membrane area, heat transfer area, and installed power for compressors and vacuum pumps) and operating conditions (flow rates, pressures, temperatures, and compositions) to achieve desired target levels of H2 product purity and H2 recovery at a minimum total annual cost. Optimal configuration and process design are obtained from a model which embeds different operating modes and process configurations. For instance, the following candidate ways to create the driving force across the membrane are embedded: (a) compression of both feed and/or permeate streams, or (b) vacuum application in permeate streams, or (c) a combination of (a) and (b). In addition, the potential selection of an expansion turbine to recover energy from the retentate stream (energy recovery system) is also embedded. For a H2 product purity of 0.90 and H2 recovery of 90%, a minimum total annual cost of 1.764 M$·year−1 was obtained for treating 100 kmol·h−1 with 0.18, 0.16, 0.62, and 0.04 mole fraction of H2, CO, N2, CO2, respectively. The optimal solution selected a combination of compression and vacuum to create the driving force and removed the expansion turbine. Afterwards, this optimal solution was compared in terms of costs, process-unit sizes, and operating conditions to the following two sub-optimal solutions: (i) no vacuum in permeate stream is applied, and (ii) the expansion turbine is included into the process. The comparison showed that the latter (ii) has the highest total annual cost (TAC) value, which is around 7% higher than the former (i) and 24% higher than the found optimal solution. 
Finally, a sensitivity analysis to investigate the influence of the desired H2 product purity and H2 recovery is presented. Opposite cost-based trade-offs between total membrane area and total electric power were observed with the variations of these two model parameters. This paper contributes a valuable decision-support tool in the process system engineering field for designing, simulating, and optimizing membrane-based systems for H2 separation in a particular industrial case; and the presented optimization results provide useful guidelines to assist in selecting the optimal configuration and operating mode.
APA, Harvard, Vancouver, ISO, and other styles
36

Montgomerie, G. A. "Executive Information Systems and Decision Support." Engineering Management Journal 3, no. 3 (1993): 134. http://dx.doi.org/10.1049/em:19930038.

Full text
APA, Harvard, Vancouver, ISO, and other styles
37

Löf, Staffan, and Björn Möller. "Knowledge systems and management decision support." Expert Systems with Applications 3, no. 2 (1991): 187–94. http://dx.doi.org/10.1016/0957-4174(91)90147-7.

Full text
APA, Harvard, Vancouver, ISO, and other styles
38

Sebastian, Patrick, and Yann Ledoux. "Decision support systems in preliminary design." International Journal on Interactive Design and Manufacturing (IJIDeM) 3, no. 4 (2009): 223–26. http://dx.doi.org/10.1007/s12008-009-0077-5.

Full text
APA, Harvard, Vancouver, ISO, and other styles
39

Rodríguez-Padial, N., M. Marín, and R. Domingo. "Strategic Framework to Maintenance Decision Support Systems." Procedia Engineering 132 (2015): 903–10. http://dx.doi.org/10.1016/j.proeng.2015.12.576.

Full text
APA, Harvard, Vancouver, ISO, and other styles
40

Riedel, Sharon L., and Gordon F. Pitz. "Utilization-Oriented Evaluation of Decision Support Systems." IEEE Transactions on Systems, Man, and Cybernetics 16, no. 6 (1986): 980–96. http://dx.doi.org/10.1109/tsmc.1986.4309016.

Full text
APA, Harvard, Vancouver, ISO, and other styles
41

Übeyli, Elif Derya. "Advances in medical decision support systems." Expert Systems 26, no. 1 (2009): 3–7. http://dx.doi.org/10.1111/j.1468-0394.2008.00503.x.

Full text
APA, Harvard, Vancouver, ISO, and other styles
42

Wu, Desheng Dash, and Jon G. Hall. "Special issue: business decision support systems." Expert Systems 28, no. 3 (2011): 197–98. http://dx.doi.org/10.1111/j.1468-0394.2011.00604.x.

Full text
APA, Harvard, Vancouver, ISO, and other styles
43

Marcos, Maria P., José Luis Pitarch, and César de Prada. "Decision support system for a heat-recovery section with equipment degradation." Decision Support Systems 137 (October 2020): 113380. http://dx.doi.org/10.1016/j.dss.2020.113380.

Full text
APA, Harvard, Vancouver, ISO, and other styles
44

Antoniou, Evangelos, Pavlos Bozios, Vasileios Christou, et al. "EEG-Based Eye Movement Recognition Using Brain–Computer Interface and Random Forests." Sensors 21, no. 7 (2021): 2339. http://dx.doi.org/10.3390/s21072339.

Full text
Abstract:
Discrimination of eye movements and visual states is a flourishing field of research, and there is an urgent need for non-manual, EEG-based wheelchair control and navigation systems. This paper presents a novel system that uses a brain–computer interface (BCI) to capture electroencephalographic (EEG) signals from human subjects during eye movement and then classify them into six categories with a random forests (RF) classification algorithm. RF is an ensemble learning method that constructs a series of decision trees; each tree gives a class prediction, and the class receiving the most votes becomes the model's prediction. The categories of the proposed random forests brain–computer interface (RF-BCI) are defined by the position of the subject's eyes: open, closed, left, right, up, and down. The purpose of RF-BCI is to serve as an EEG-based control system for driving an electromechanical wheelchair (rehabilitation device). The proposed approach was tested on a dataset of 219 records taken from 10 different patients. The BCI used the EPOC Flex head-cap system, which includes 32 saline felt sensors for capturing the subjects' EEG signals. Each sensor captured four different brain waves (delta, theta, alpha, and beta) per second. These signals were then split into 4-second windows, yielding 512 samples per record, and the band energy was extracted for each EEG rhythm. The proposed system was compared with naïve Bayes, Bayes network, k-nearest neighbors (K-NN), multilayer perceptron (MLP), support vector machine (SVM), J48-C4.5 decision tree, and bagging classification algorithms. The experimental results showed that the RF algorithm outperformed the other approaches, achieving high accuracy (85.39%) for the six-class classification. This method exploits the high spatial information acquired from the Emotiv EPOC Flex wearable EEG recording device and successfully demonstrates the device's potential for BCI wheelchair technology.
APA, Harvard, Vancouver, ISO, and other styles
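The majority-vote mechanism this abstract describes (each decision tree predicts a class; the class with the most votes becomes the forest's prediction) can be sketched in a few lines of library-free Python. The toy "trees" and the four-value band-energy input below are illustrative stand-ins, not the paper's trained models or data.

```python
from collections import Counter

def forest_predict(trees, x):
    """Random-forest prediction: every tree votes for a class; majority wins."""
    votes = [tree(x) for tree in trees]
    return Counter(votes).most_common(1)[0][0]

# Three toy "trees" (stand-ins for trained decision trees) voting on a
# hypothetical band-energy vector [delta, theta, alpha, beta]:
trees = [
    lambda x: "left" if x[2] > x[3] else "right",   # alpha vs. beta
    lambda x: "left" if x[0] > 0.5 else "right",    # delta threshold
    lambda x: "left",                               # constant stump
]
print(forest_predict(trees, [0.2, 0.1, 0.9, 0.3]))  # -> left (2 of 3 votes)
```

The real paper additionally extracts band-energy features from 32 channels over 4-second windows before classification; this sketch only shows the voting step.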
45

Kabachinski, Jeff. "A Look at Clinical Decision Support Systems." Biomedical Instrumentation & Technology 47, no. 5 (2013): 432–34. http://dx.doi.org/10.2345/0899-8205-47.5.432.

Full text
APA, Harvard, Vancouver, ISO, and other styles
46

Seeberg, T. M., A. B. Vardy, M. M. V. Taklo, and H. O. Austad. "Decision Support for Subjects Exposed to Heat Stress." IEEE Journal of Biomedical and Health Informatics 17, no. 2 (2013): 402–10. http://dx.doi.org/10.1109/jbhi.2013.2245141.

Full text
APA, Harvard, Vancouver, ISO, and other styles
47

Bohner, Christiane. "Decision-support systems for sustainable urban planning." International Journal of Environmental Technology and Management 6, no. 1/2 (2006): 193. http://dx.doi.org/10.1504/ijetm.2006.008261.

Full text
APA, Harvard, Vancouver, ISO, and other styles
48

Yu, Shi Dong, Hang Li, and Chen Wang. "Information Visualization for Decision Support Systems on Commerce." Advanced Materials Research 267 (June 2011): 13–18. http://dx.doi.org/10.4028/www.scientific.net/amr.267.13.

Full text
Abstract:
Many users benefit from decision support systems (DSS), but they cannot always readily comprehend the nature or meaning of DSS output. In general, interpreting the data is far more intuitive when the results of the DSS are translated into charts, maps, and other graphical displays, because visualization exploits our natural ability to recognize and understand visual patterns. In this paper we discuss the concept of a visualization user interface (VUI) for decision support systems on commerce (CDSS). An information visualization model for CDSS, consisting of three elements, is proposed. In addition, a visualized information retrieval engine based on fuzzy control is proposed.
APA, Harvard, Vancouver, ISO, and other styles
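As a toy illustration of this abstract's premise (graphical displays make DSS output easier to interpret than raw numbers), here is a minimal sketch that renders hypothetical decision scores as a text bar chart. The option names and scores are invented; the paper's actual VUI model and fuzzy-control engine are not reproduced here.

```python
def bar_chart(scores, width=20):
    """Render a dict of DSS scores as a descending text bar chart."""
    top = max(scores.values())
    return "\n".join(
        f"{name:<10} {'#' * round(width * v / top)} {v:.2f}"
        for name, v in sorted(scores.items(), key=lambda kv: -kv[1])
    )

# Hypothetical DSS ranking of three alternatives:
print(bar_chart({"option A": 0.82, "option B": 0.41, "option C": 0.67}))
```

Even this crude display lets a reader spot the best alternative at a glance, which is the abstract's core argument for visualization in DSS.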
49

Sun, Xiaoqian, Volker Gollnick, Yongchang Li, and Eike Stumpf. "Intelligent Multicriteria Decision Support System for Systems Design." Journal of Aircraft 51, no. 1 (2014): 216–25. http://dx.doi.org/10.2514/1.c032296.

Full text
APA, Harvard, Vancouver, ISO, and other styles
50

Economou, G. P. K., N. M. Economopoulos, D. Lymberopoulos, and C. E. Goutis. "Experiences accumulated working towards medical decision support systems." Microprocessing and Microprogramming 40, no. 10-12 (1994): 883–86. http://dx.doi.org/10.1016/0165-6074(94)90061-2.

Full text
APA, Harvard, Vancouver, ISO, and other styles