Academic literature on the topic 'Weighted adaptive min-max normalization'

Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles


Consult the lists of relevant articles, books, theses, conference reports, and other scholarly sources on the topic 'Weighted adaptive min-max normalization.'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever these are available in the metadata.

Journal articles on the topic "Weighted adaptive min-max normalization"

1

Kalluri, Venkata Saiteja, Sai Chakravarthy Malineni, Manjula Seenivasan, Jeevitha Sakkarai, Deepak Kumar, and Bhuvanesh Ananthan. "Enhancing manufacturing efficiency: leveraging CRM data with Lean-based DL approach for early failure detection." Bulletin of Electrical Engineering and Informatics 14, no. 3 (2025): 2319–29. https://doi.org/10.11591/eei.v14i3.8757.

Abstract:
In the pursuit of enhancing manufacturing competitiveness in India, companies are exploring innovative strategies to streamline operations and ensure product quality. Embracing Lean principles has become a focal point for many, aiming to optimize profitability while minimizing waste. As part of this endeavour, researchers have introduced various methodologies grounded in Lean principles to track and mitigate operational inefficiencies. This paper introduces a novel approach leveraging deep learning (DL) techniques to detect early failures in manufacturing systems. Initially, real-time data is c…
2

Prasetyowati, Sri Arttini Dwi, Munaf Ismail, and Badieah Badieah. "Implementation of Least Mean Square Adaptive Algorithm on Covid-19 Prediction." JUITA: Jurnal Informatika 10, no. 1 (2022): 139. http://dx.doi.org/10.30595/juita.v10i1.11963.

Abstract:
This study used Corona Virus Disease-19 (Covid-19) data in Indonesia from June to August 2021, consisting of data on people who were infected with (positive for) Covid-19, recovered from Covid-19, or passed away from Covid-19. Processing the data directly with the adaptive LMS algorithm, without pre-processing, caused calculation errors because the Covid-19 data were not balanced. Z-score and min-max normalization were chosen as pre-processing methods. After that, the prediction process could be carried out using the adaptive LMS method. The analysis was done by observing the prediction error that occurred…
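For readers who want to see how such a pipeline fits together, here is a minimal sketch (not the authors' code): it min-max normalizes a toy case series and then runs a least-mean-squares adaptive predictor over it. The filter order, the step size `mu`, and the toy numbers are assumptions.

```python
import numpy as np

def min_max_normalize(x):
    """Scale a series into [0, 1]; guard against a constant series."""
    span = x.max() - x.min()
    return (x - x.min()) / span if span > 0 else np.zeros_like(x)

def lms_predict(x, order=3, mu=0.05):
    """One-step-ahead LMS prediction: y[n] ~ w . x[n-order:n]."""
    w = np.zeros(order)
    preds = np.zeros_like(x)
    for n in range(order, len(x)):
        window = x[n - order:n]
        preds[n] = w @ window
        error = x[n] - preds[n]
        w += 2 * mu * error * window   # LMS weight update
    return preds

# Toy daily-case series standing in for the Covid-19 data.
cases = np.array([10, 12, 15, 22, 30, 41, 55, 70, 90, 115], dtype=float)
normalized = min_max_normalize(cases)
predicted = lms_predict(normalized)
print(np.round(predicted, 3))
```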
3

Rodríguez, Carlos Gervasio, María Isabel Lamas, Juan de Dios Rodríguez, and Claudio Caccia. "Analysis of the Pre-Injection Configuration in a Marine Engine through Several MCDM Techniques." Brodogradnja 72, no. 4 (2021): 1–17. http://dx.doi.org/10.21278/brod72401.

Abstract:
The present manuscript describes a computational model employed to characterize the performance and emissions of a commercial marine diesel engine. This model analyzes several pre-injection parameters, such as starting instant, quantity, and duration. The goal is to reduce nitrogen oxides (NOx), as well as the effect on emissions and consumption. Since some of the parameters considered have opposite effects on the results, the present work proposes an MCDM (Multiple-Criteria Decision Making) methodology to determine the most adequate pre-injection configuration. An important issue in MCDM model…
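As a generic illustration of where min-max normalization enters an MCDM workflow, the sketch below normalizes a small decision matrix column-wise, inverts the cost criteria, and ranks alternatives with a weighted sum. The criteria, their values, and the weights are invented for illustration and do not come from the paper.

```python
import numpy as np

# Rows: candidate pre-injection configurations; columns: criteria.
# Criteria: NOx (minimize), fuel consumption (minimize), power output (maximize).
decision_matrix = np.array([
    [4.2, 198.0, 2150.0],
    [3.8, 205.0, 2080.0],
    [4.6, 192.0, 2210.0],
])
weights = np.array([0.5, 0.3, 0.2])       # assumed criterion weights
is_cost = np.array([True, True, False])   # which criteria should be minimized

# Column-wise min-max normalization into [0, 1].
mins, maxs = decision_matrix.min(axis=0), decision_matrix.max(axis=0)
normalized = (decision_matrix - mins) / (maxs - mins)
normalized[:, is_cost] = 1.0 - normalized[:, is_cost]   # flip cost criteria so higher = better

scores = normalized @ weights
print("Best configuration:", int(np.argmax(scores)), "scores:", np.round(scores, 3))
```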
4

Himsar, Himsar. "Payment System Liquidity Index." Talenta Conference Series: Energy and Engineering (EE) 1, no. 2 (2018): 196–210. http://dx.doi.org/10.32734/ee.v1i2.250.

Abstract:
ISSP is an index that represents the payment system's stability in terms of its liquidity (ISLSP) and its operational capability (IOSP). It was formed using two methods: statistical normalization and conversion using empirical min-max normalization. This paper intends to evaluate the variables used in forming the ISLSP and to serve as a tool to ensure the data's sensitivity to important events. To obtain an ISLSP that is sensitive to the RTGS liquidity condition, we use the coefficient of each weighted variable obtained through simultaneous regression. We obtain symbolized parameters that are…
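A composite index built from min-max-normalized, weighted indicators can be sketched as follows. The indicator names, values, and weights below are hypothetical stand-ins for the regression coefficients described in the abstract, not the actual ISLSP inputs.

```python
import numpy as np

def min_max(series):
    """Empirical min-max normalization of one indicator into [0, 1]."""
    lo, hi = series.min(), series.max()
    return (series - lo) / (hi - lo)

# Hypothetical monthly liquidity indicators for an RTGS system.
indicators = {
    "turnover_ratio": np.array([0.8, 0.9, 1.1, 1.4, 1.2]),
    "queue_volume":   np.array([120., 95., 140., 200., 160.]),
    "intraday_repo":  np.array([5., 3., 8., 12., 7.]),
}
# Hypothetical coefficients playing the role of the regression-derived weights.
weights = {"turnover_ratio": 0.5, "queue_volume": 0.3, "intraday_repo": 0.2}

index = sum(w * min_max(indicators[name]) for name, w in weights.items())
print("Composite liquidity index:", np.round(index, 3))
```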
5

Shantal, Mohammed, Zalinda Othman, and Azuraliza Abu Bakar. "A Novel Approach for Data Feature Weighting Using Correlation Coefficients and Min–Max Normalization." Symmetry 15, no. 12 (2023): 2185. http://dx.doi.org/10.3390/sym15122185.

Abstract:
In the realm of data analysis and machine learning, achieving an optimal balance of feature importance, known as feature weighting, plays a pivotal role, especially when considering the nuanced interplay between the symmetry of the data distribution and the need to assign differential weights to individual features. Avoiding the dominance of large-scale features is also essential in data preparation. This step makes choosing an effective normalization approach one of the most challenging aspects of machine learning. In addition to normalization, feature weighting is another strategy to deal with th…
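The combination of min-max normalization with correlation-based feature weights can be illustrated with a small sketch. Using the absolute Pearson correlation with the target as the weight is an assumption about the general scheme, not the authors' exact formulation, and the toy dataset is invented.

```python
import numpy as np

def weighted_min_max(X, y):
    """Min-max normalize each feature, then weight it by |corr(feature, target)|."""
    mins, maxs = X.min(axis=0), X.max(axis=0)
    scaled = (X - mins) / (maxs - mins)
    # Pearson correlation of each column with the target.
    weights = np.array([abs(np.corrcoef(X[:, j], y)[0, 1]) for j in range(X.shape[1])])
    return scaled * weights, weights

# Toy data: three features with very different scales.
X = np.array([[1.0, 200.0, 0.01],
              [2.0, 180.0, 0.03],
              [3.0, 240.0, 0.02],
              [4.0, 260.0, 0.05]])
y = np.array([0.0, 0.0, 1.0, 1.0])

Xw, w = weighted_min_max(X, y)
print("feature weights:", np.round(w, 2))
print(np.round(Xw, 2))
```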
6

Hafs, Toufik, Hatem Zehir, and Ali Hafs. "Enhancing Recognition in Multimodal Biometric Systems: Score Normalization and Fusion of Online Signatures and Fingerprints." Romanian Journal of Information Science and Technology 2024, no. 1 (2024): 37–49. http://dx.doi.org/10.59277/romjist.2024.1.03.

Abstract:
Multimodal biometrics employs multiple modalities within a single system to address the limitations of unimodal systems, such as incomplete data acquisition or deliberate fraud, while enhancing recognition accuracy. This study explores score normalization and its impact on system performance. To fuse scores effectively, prior normalization is necessary, followed by a weighted sum fusion technique that aligns impostor and genuine scores within a common range. Experiments conducted on three biometric databases demonstrate the promising efficacy of the proposed approach, particularly when combined…
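A minimal version of min-max score normalization followed by weighted-sum fusion of two matchers might look like the sketch below. The matcher scores and the fusion weight are invented for illustration; the paper's databases and tuning are not reproduced.

```python
import numpy as np

def min_max_scores(scores, lo=None, hi=None):
    """Map matcher scores into [0, 1] using (training) min and max."""
    lo = scores.min() if lo is None else lo
    hi = scores.max() if hi is None else hi
    return (scores - lo) / (hi - lo)

# Hypothetical match scores from two modalities for the same probes.
signature_scores   = np.array([0.62, 0.14, 0.88, 0.40])
fingerprint_scores = np.array([310., 95., 540., 220.])   # different native range

w = 0.6  # assumed weight given to the signature matcher
fused = w * min_max_scores(signature_scores) + (1 - w) * min_max_scores(fingerprint_scores)
print("fused scores:", np.round(fused, 3))
```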
7

Nayak, Dillip Ranjan, Neelamadhab Padhy, Pradeep Kumar Mallick, Mikhail Zymbler, and Sachin Kumar. "Brain Tumor Classification Using Dense Efficient-Net." Axioms 11, no. 1 (2022): 34. http://dx.doi.org/10.3390/axioms11010034.

Abstract:
Brain tumors are most common in children and the elderly. It is a serious form of cancer caused by uncontrollable brain cell growth inside the skull. Tumor cells are notoriously difficult to classify due to their heterogeneity. Convolutional neural networks (CNNs) are the most widely used machine learning algorithm for visual learning and brain tumor recognition. This study proposed a CNN-based dense EfficientNet using min-max normalization to classify 3260 T1-weighted contrast-enhanced brain magnetic resonance images into four categories (glioma, meningioma, pituitary, and no tumor). The deve…
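The min-max normalization step mentioned here amounts to a per-image rescaling of pixel intensities before the images reach the network. A short sketch follows; the random array is only a stand-in for a T1-weighted MRI slice, not the paper's dataset.

```python
import numpy as np

def normalize_slices(volume):
    """Min-max normalize each image slice independently into [0, 1] float32."""
    volume = volume.astype(np.float32)
    lo = volume.min(axis=(1, 2), keepdims=True)
    hi = volume.max(axis=(1, 2), keepdims=True)
    return (volume - lo) / np.maximum(hi - lo, 1e-8)

# Random 8-bit stack standing in for a batch of contrast-enhanced MRI slices.
batch = np.random.randint(0, 256, size=(4, 224, 224), dtype=np.uint8)
normalized = normalize_slices(batch)
print(normalized.shape, float(normalized.min()), float(normalized.max()))
```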
8

Patanavijit, Vorapoj. "Denoising performance analysis of adaptive decision based inverse distance weighted interpolation (DBIDWI) algorithm for salt and pepper noise." Indonesian Journal of Electrical Engineering and Computer Science 15, no. 2 (2019): 804. http://dx.doi.org/10.11591/ijeecs.v15.i2.pp804-813.

Abstract:
Due to its superior performance in denoising images contaminated by impulsive noise, the adaptive decision based inverse distance weighted interpolation (DBIDWI) algorithm, proposed in 2017, is one of the most dominant and successful denoising algorithms. However, the DBIDWI algorithm is not suited to denoising full dynamic intensity range images, which contain genuine min or max intensities. Consequently, this research article aims to study the performance and limitations of the DBIDWI algorithm when it is applied to both general i…
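The decision-based idea, replacing pixels stuck at the minimum or maximum intensity with an inverse-distance-weighted average of clean neighbours, can be sketched as follows. This is a simplified fixed-window illustration, not the published DBIDWI algorithm.

```python
import numpy as np

def idw_denoise(img, radius=1):
    """Replace suspected salt-and-pepper pixels (0 or 255) with an
    inverse-distance-weighted average of non-extreme neighbours."""
    out = img.astype(np.float32).copy()
    h, w = img.shape
    for y in range(h):
        for x in range(w):
            if img[y, x] not in (0, 255):        # decision step: keep clean pixels
                continue
            values, weights = [], []
            for dy in range(-radius, radius + 1):
                for dx in range(-radius, radius + 1):
                    ny, nx = y + dy, x + dx
                    if (dy or dx) and 0 <= ny < h and 0 <= nx < w and img[ny, nx] not in (0, 255):
                        dist = (dy * dy + dx * dx) ** 0.5
                        values.append(float(img[ny, nx]))
                        weights.append(1.0 / dist)
            if weights:                          # leave pixel unchanged if no clean neighbour
                out[y, x] = np.average(values, weights=weights)
    return out.astype(img.dtype)

noisy = np.array([[120, 255, 118],
                  [119,   0, 121],
                  [117, 122, 120]], dtype=np.uint8)
print(idw_denoise(noisy))
```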
9

Lu, Kuan, Song Gao, Pang Xiangkun, Zhu Lingkai, Xiangrong Meng, and Wenxue Sun. "Multi-layer Long Short-term Memory based Condenser Vacuum Degree Prediction Model on Power Plant." E3S Web of Conferences 136 (2019): 01012. http://dx.doi.org/10.1051/e3sconf/201913601012.

Abstract:
A multi-layer LSTM (long short-term memory) model is proposed for condenser vacuum degree prediction in power plants. Firstly, min-max normalization is used to pre-process the input data. Then, the model uses a two-layer LSTM architecture to identify the time-series pattern effectively. The ADAM (adaptive moment) optimizer is selected to find the optimum parameters for the model during training. Under the proposed forecasting framework, experiments illustrate that the two-layer LSTM model gives a more accurate forecast of the condenser vacuum degree compared with other simple RNN (recurren…
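A skeleton of that pipeline, min-max normalization of the input series followed by a two-layer LSTM trained with the Adam optimizer, might look like the following sketch. PyTorch, the window length, the hidden size, and the synthetic series are all assumptions; the plant model from the paper is not reproduced.

```python
import torch
import torch.nn as nn

# Synthetic vacuum-degree series standing in for the plant measurements.
series = torch.rand(500) * 10 + 90
series = (series - series.min()) / (series.max() - series.min())   # min-max normalization

# Build (window -> next value) training pairs.
window = 20
X = torch.stack([series[i:i + window] for i in range(len(series) - window)]).unsqueeze(-1)
y = series[window:].unsqueeze(-1)

class TwoLayerLSTM(nn.Module):
    def __init__(self, hidden=32):
        super().__init__()
        self.lstm = nn.LSTM(input_size=1, hidden_size=hidden, num_layers=2, batch_first=True)
        self.head = nn.Linear(hidden, 1)

    def forward(self, x):
        out, _ = self.lstm(x)          # out: (batch, window, hidden)
        return self.head(out[:, -1])   # predict from the last time step

model = TwoLayerLSTM()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)   # Adam = adaptive moment estimation
loss_fn = nn.MSELoss()

for epoch in range(5):                 # short demo loop
    optimizer.zero_grad()
    loss = loss_fn(model(X), y)
    loss.backward()
    optimizer.step()
    print(f"epoch {epoch}: loss {loss.item():.5f}")
```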
10

Sharma, Nikhil, Prateek Jeet Singh Sohi, and Bharat Garg. "An Adaptive Weighted Min-Mid-Max Value Based Filter for Eliminating High Density Impulsive Noise." Wireless Personal Communications 119, no. 3 (2021): 1975–92. http://dx.doi.org/10.1007/s11277-021-08314-5.


Conference papers on the topic "Weighted adaptive min-max normalization"

1

Patel, Chetan, Aarsh Pandey, Rajesh Wadhvani, and Deepali Patil. "Forecasting Nonstationary Wind Data Using Adaptive Min-Max Normalization." In 2022 1st International Conference on Sustainable Technology for Power and Energy Systems (STPES). IEEE, 2022. http://dx.doi.org/10.1109/stpes54845.2022.10006473.
