Academic literature on the topic 'Rectified Linear Unit (ReLU)'

Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles


Consult the lists of relevant articles, books, theses, conference reports, and other scholarly sources on the topic 'Rectified Linear Unit (ReLU).'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever these are available in the metadata.

Journal articles on the topic "Rectified Linear Unit (ReLU)"

1

Abdeljawad, Ahmed, and Philipp Grohs. "Approximations with deep neural networks in Sobolev time-space." Analysis and Applications 20, no. 3 (2022): 499–541. http://dx.doi.org/10.1142/s0219530522500014.

Abstract:
Solutions of evolution equations generally lie in certain Bochner–Sobolev spaces, in which the solutions may have regularity and integrability properties for the time variable that can be different from those for the space variables. Therefore, in this paper, we develop a framework that shows that deep neural networks can approximate Sobolev-regular functions with respect to Bochner–Sobolev spaces. In our work, we use the so-called Rectified Cubic Unit (ReCU) as an activation function in our networks. This activation function allows us to deduce approximation results for the neural networks while avoiding…
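
For readers unfamiliar with the activation named in this abstract, here is a minimal NumPy sketch of the Rectified Cubic Unit, assuming the common definition ReCU(x) = max(0, x)^3; the paper's exact normalization is not reproduced here.

```python
import numpy as np

def recu(x):
    # Rectified Cubic Unit, assuming the common definition max(0, x)**3.
    # Unlike ReLU, this is twice continuously differentiable at 0.
    return np.maximum(0.0, x) ** 3

print(recu(np.array([-1.0, 0.0, 0.5, 2.0])))  # [0.  0.  0.125  8.]
```
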
2

Bai, Yuhan. "RELU-Function and Derived Function Review." SHS Web of Conferences 144 (2022): 02006. http://dx.doi.org/10.1051/shsconf/202214402006.

Abstract:
The activation function plays an important role in training and improving performance in deep neural networks (DNNs). The rectified linear unit (ReLU) function provides the necessary non-linear properties in a DNN. However, few papers sort out and compare the various ReLU-derived activation functions. Most papers focus on the efficiency and accuracy of certain activation functions used by a model, but do not pay attention to the nature and differences of these activation functions. Therefore, this paper attempts to organize the ReLU function and its derived functions…
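
Since this review concerns ReLU and its derived functions, a minimal NumPy sketch of ReLU and three widely used variants may help fix the definitions; the slopes and offsets below are the usual defaults, not values taken from the paper.

```python
import numpy as np

def relu(x):
    # ReLU(x) = max(0, x): identity for positive inputs, zero otherwise.
    return np.maximum(0.0, x)

def leaky_relu(x, alpha=0.01):
    # Leaky ReLU keeps a small fixed slope alpha for x < 0,
    # so negative units still receive gradient ("dying ReLU" mitigation).
    return np.where(x > 0, x, alpha * x)

def prelu(x, alpha):
    # Parametric ReLU: same form as Leaky ReLU, but alpha is learned.
    return np.where(x > 0, x, alpha * x)

def elu(x, alpha=1.0):
    # ELU saturates smoothly to -alpha for large negative inputs.
    return np.where(x > 0, x, alpha * np.expm1(x))

x = np.linspace(-2.0, 2.0, 5)
print(relu(x))
print(leaky_relu(x))
print(elu(x))
```
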
3

Garg, Shruti, Soumyajit Behera, K. Rahul Patro, and Ashwani Garg. "Deep Neural Network for Electroencephalogram based Emotion Recognition." IOP Conference Series: Materials Science and Engineering 1187, no. 1 (2021): 012012. http://dx.doi.org/10.1088/1757-899x/1187/1/012012.

Abstract:
Emotion recognition using electroencephalogram (EEG) signals is an aspect of affective computing. The EEG refers to recording brain responses via electrical signals by showing external stimuli to the participants. This paper proposes the prediction of valence, arousal, dominance and liking for EEG signals using a deep neural network (DNN). The EEG data is obtained from the AMIGOS dataset, a publicly available dataset for mood and personality research. Two features, normalized power and normalized wavelet energy, are extracted using the Fourier and wavelet transforms, respectively. A DNN…
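
As a rough illustration of the Fourier-based feature family this abstract mentions, here is a hedged sketch of a normalized band-power computation; the band edges, sampling rate, and random stand-in signal are assumptions for illustration, not the AMIGOS preprocessing pipeline.

```python
import numpy as np

def normalized_band_power(signal, fs, band):
    # Power in a frequency band, normalized by total spectral power.
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    psd = np.abs(np.fft.rfft(signal)) ** 2
    lo, hi = band
    return psd[(freqs >= lo) & (freqs < hi)].sum() / psd.sum()

eeg = np.random.randn(1024)                # stand-in for one EEG channel
print(normalized_band_power(eeg, fs=128, band=(8, 13)))  # alpha band
```
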
4

Katende, Ronald, Henry Kasumba, Godwin Kakuba, and John M. Mango. "A proof of convergence and equivalence for 1D finite element methods and ReLU neural networks." Annals of Mathematics and Computer Science 25 (November 16, 2024): 97–111. http://dx.doi.org/10.56947/amcs.v25.392.

Abstract:
This paper investigates the convergence and equivalence properties of the Finite Element Method (FEM) and Rectified Linear Unit Neural Networks (ReLU NNs) in solving differential equations. We provide a detailed comparison of the two approaches, highlighting their mutual capabilities in function space approximation. Our analysis proves the subset and superset inclusions that establish the equivalence between FEM and ReLU NNs for approximating piecewise linear functions. Furthermore, a comprehensive numerical evaluation is presented, demonstrating the error convergence behavior of ReLU NNs…
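
One direction of the FEM/ReLU-NN equivalence described here can be made concrete: a 1D piecewise-linear finite-element "hat" basis function is exactly representable by a three-neuron shallow ReLU network. A minimal sketch, where equidistant nodes are an assumption made for brevity:

```python
import numpy as np

def relu(x):
    return np.maximum(0.0, x)

def hat(x, b, h):
    # P1 finite-element hat function centered at node b with equidistant
    # neighbors b - h and b + h, written exactly as a three-neuron
    # shallow ReLU network.
    a, c = b - h, b + h
    return (relu(x - a) - 2.0 * relu(x - b) + relu(x - c)) / h

x = np.linspace(-1.0, 2.0, 7)
print(hat(x, b=0.5, h=0.5))  # peaks at 1.0 at x = 0.5, zero outside [0, 1]
```
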
5

McCarty, Sarah. "Piecewise linear functions representable with infinite width shallow ReLU neural networks." Proceedings of the American Mathematical Society, Series B 10, no. 27 (2023): 296–310. http://dx.doi.org/10.1090/bproc/186.

Abstract:
This paper analyzes representations of continuous piecewise linear functions with infinite width, finite cost shallow neural networks using the rectified linear unit (ReLU) as an activation function. Through its integral representation, a shallow neural network can be identified by the corresponding signed, finite measure on an appropriate parameter space. We map these measures on the parameter space to measures on the projective n-sphere cross ℝ, allowing points in the parameter space to be bijectively mapped to hyperplanes in the domain of the function. We prove a conjecture…
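
A finite-width analogue of the integral representation discussed in this paper: a shallow ReLU network is a weighted sum of ReLU ridge functions, each parameter pair defining one breakpoint (a hyperplane in higher dimensions). This is a sketch, not the paper's measure-theoretic construction.

```python
import numpy as np

def shallow_relu_net(x, w, b, a):
    # f(x) = sum_i a_i * ReLU(w_i * x + b_i): each (w_i, b_i) pair
    # contributes one breakpoint of the piecewise-linear output.
    pre = np.outer(x, w) + b            # (len(x), width) pre-activations
    return np.maximum(0.0, pre) @ a

# Two neurons suffice to represent |x| = ReLU(x) + ReLU(-x):
x = np.linspace(-2.0, 2.0, 9)
print(shallow_relu_net(x, w=np.array([1.0, -1.0]),
                       b=np.zeros(2), a=np.array([1.0, 1.0])))
```
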
6

Tedyyana, Agus, Osman Ghazali, Onno W. Purbo, and Mohamad Amir Abu Seman. "Enhancing intrusion detection system using rectified linear unit function in pigeon inspired optimization algorithm." IAES International Journal of Artificial Intelligence (IJ-AI) 13, no. 2 (2024): 1526–34. http://dx.doi.org/10.11591/ijai.v13.i2.pp1526-1534.

Abstract:
The increasing rate of cybercrime in the digital world highlights the importance of having a reliable intrusion detection system (IDS) to detect unauthorized attacks and notify administrators. An IDS can leverage machine learning techniques to identify patterns of attacks and provide real-time notifications. In building a successful IDS, selecting the right features is crucial, as it determines the accuracy of the predictions made by the model. This paper presents a new IDS algorithm that combines the rectified linear unit (ReLU) activation function with a pigeon-inspired optimizer in feature selection…
7

Gühring, Ingo, Gitta Kutyniok, and Philipp Petersen. "Error bounds for approximations with deep ReLU neural networks in W^{s,p} norms." Analysis and Applications 18, no. 5 (2019): 803–59. http://dx.doi.org/10.1142/s0219530519410021.

Abstract:
We analyze to what extent deep Rectified Linear Unit (ReLU) neural networks can efficiently approximate Sobolev regular functions if the approximation error is measured with respect to weaker Sobolev norms. In this context, we first establish upper approximation bounds by ReLU neural networks for Sobolev regular functions by explicitly constructing the approximate ReLU neural networks. Then, we establish lower approximation bounds for the same type of function classes. A trade-off between the regularity used in the approximation norm and the complexity of the neural network can be observed…
8

Almatroud, Othman Abdullah, Viet-Thanh Pham, and Karthikeyan Rajagopal. "A Rectified Linear Unit-Based Memristor-Enhanced Morris–Lecar Neuron Model." Mathematics 12, no. 19 (2024): 2970. http://dx.doi.org/10.3390/math12192970.

Abstract:
This paper introduces a modified Morris–Lecar neuron model that incorporates a memristor with a ReLU-based activation function. The impact of the memristor on the dynamics of the ML neuron model is analyzed using bifurcation diagrams and Lyapunov exponents. The findings reveal chaotic behavior within specific parameter ranges, while increased magnetic strength tends to maintain periodic dynamics. The emergence of various firing patterns, including periodic and chaotic spiking as well as square-wave and triangle-wave bursting, is also evident. The modified model also demonstrates multistability…
9

Noprisson, Handrie, Vina Ayumi, Mariana Purba, and Nur Ani. "MobileNet Performance Improvements for Deepfake Image Identification Using Activation Function and Regularization." JITK (Jurnal Ilmu Pengetahuan dan Teknologi Komputer) 10, no. 2 (2024): 441–48. http://dx.doi.org/10.33480/jitk.v10i2.5798.

Abstract:
Deepfake images are often used to spread false information, manipulate public opinion, and harm individuals by creating fake content, making the development of deepfake detection technology essential to mitigate these potential dangers. This study utilized the MobileNet architecture, applying regularization and activation function methods to improve detection accuracy. ReLU (Rectified Linear Unit) enhances the model's efficiency and ability to capture non-linear features, while Dropout and L2 regularization help reduce overfitting by penalizing large weights, thereby improving generalization…
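
The regularization recipe this abstract describes (ReLU layers plus Dropout and L2 weight penalties on a MobileNet backbone) might look roughly like the following Keras sketch; the layer width, dropout rate, and L2 strength are illustrative assumptions, not the paper's tuned values.

```python
import tensorflow as tf

# Sketch: MobileNet backbone + ReLU dense head with Dropout and L2
# regularization for binary deepfake classification.
base = tf.keras.applications.MobileNet(include_top=False, weights=None,
                                       pooling="avg", input_shape=(224, 224, 3))
model = tf.keras.Sequential([
    base,
    tf.keras.layers.Dense(128, activation="relu",
                          kernel_regularizer=tf.keras.regularizers.l2(1e-4)),
    tf.keras.layers.Dropout(0.5),                    # combats overfitting
    tf.keras.layers.Dense(1, activation="sigmoid"),  # real vs. deepfake
])
model.compile(optimizer="adam", loss="binary_crossentropy",
              metrics=["accuracy"])
```
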

Book chapters on the topic "Rectified Linear Unit (ReLU)"

1

Pradhan, Biswajeet, and Maher Ibrahim Sameen. "Manifestation of SVM-Based Rectified Linear Unit (ReLU) Kernel Function in Landslide Modelling." In Space Science and Communication for Sustainability. Springer Singapore, 2017. http://dx.doi.org/10.1007/978-981-10-6574-3_16.

2

Wang, Li, and Wenhao Chen. "A Rectified Linear Unit Model for Diagnosing VCSEL’s Power Output." In Communications in Computer and Information Science. Springer Singapore, 2020. http://dx.doi.org/10.1007/978-981-15-2810-1_46.

3

Bede, Barnabas, Vladik Kreinovich, and Uyen Pham. "Why Rectified Linear Unit Is Efficient in Machine Learning: One More Explanation." In Machine Learning for Econometrics and Related Topics. Springer Nature Switzerland, 2024. http://dx.doi.org/10.1007/978-3-031-43601-7_9.

4

Xue, Yi-ming, Wan-li Peng, Yuzhu Wang, Juan Wen, and Ping Zhong. "Optimized CNN with Point-Wise Parametric Rectified Linear Unit for Spatial Image Steganalysis." In Digital Forensics and Watermarking. Springer International Publishing, 2020. http://dx.doi.org/10.1007/978-3-030-43575-2_3.

5

Shi, Hong Wei, Li Shen Wang, Jiamin Moran Huang, and Jun Steed Huang. "Forest Environment Association Analysis for the Pandemic Health with Rectified Linear Unit Correlations." In Computational and Experimental Simulations in Engineering. Springer International Publishing, 2022. http://dx.doi.org/10.1007/978-3-031-02097-1_10.

6

Gaber, Heba, Hatem Mohamed, and Mina Ibrahim. "Breast Cancer Classification from Histopathological Images with Separable Convolutional Neural Network and Parametric Rectified Linear Unit." In Advances in Intelligent Systems and Computing. Springer International Publishing, 2020. http://dx.doi.org/10.1007/978-3-030-58669-0_34.

7

Tekwani, Bharti, and Archana Bohra Gupta. "An Application of Deep Learning Using Leaky Rectified Linear Unit and Hyperbolic Tangent in Non-destructive Testing." In Proceedings of International Conference on Computational Intelligence. Springer Nature Singapore, 2024. http://dx.doi.org/10.1007/978-981-97-3526-6_11.

8

Shafik, Wasswa, Ali Tufail, Liyanage Chandratilak De Silva, and Rosyzie Anna Haji Mohd Apong. "An Enhanced Deep Convolutional Neural Network for Plant Disease Detection and Classification." In Artificial Intelligence and Data Science for Sustainability. IGI Global, 2025. https://doi.org/10.4018/979-8-3693-6829-9.ch010.

Abstract:
This research introduces a novel enhanced deep convolutional neural network for plant disease detection and classification, a cutting-edge tool that is set to revolutionize the field. The study enhances the ResNet50 network by replacing the fully connected layer with three layers that improve discrimination and feature extraction, namely the convolution, batch normalization, and leaky rectified linear unit (ReLU) activation layers. Experimental performance assessments were conducted to evaluate the performance of the proposed model in comparison to the original ResNet50, EfficientNet, DesNet20…
9

Jayashree, S., B. Jansi, and V. Sumalatha. "Exploring Activation Functions: A Comprehensive Study on Enhancing Conventional Neural Network Learning." In Futuristic Trends in Information Technology Volume 3 Book 3. Iterative International Publishers, Selfypage Developers Pvt Ltd, 2024. http://dx.doi.org/10.58532/v3bkit3p1ch2.

Abstract:
Activation functions are essential components in neural networks as they introduce non-linearity, enabling the model to learn complex relationships in the data. Their role in enhancing the learning capabilities of conventional neural networks is crucial to achieving high performance in various tasks. This comprehensive study delves into the world of activation functions, examining their characteristics, advantages, and limitations, with a focus on enhancing the learning process of conventional neural networks. Various activation functions are meticulously analyzed to understand their impact on…

Conference papers on the topic "Rectified Linear Unit (ReLU)"

1

Moreno-Palancas, Isabela Fons, Raquel Salcedo Díaz, Rubén Ruiz Femenia, and José A. Caballero. "Handling discrete decisions in bilevel optimization via neural network embeddings." In The 35th European Symposium on Computer Aided Process Engineering. PSE Press, 2025. https://doi.org/10.69997/sct.175350.

Abstract:
Bilevel optimization is an active area of research within the operations research community due to its ability to capture the interdependencies between two levels of decisions. This study introduces a metamodeling approach for addressing mixed-integer bilevel optimization problems, exploiting the approximation capabilities of neural networks. The proposed methodology employs neural network embeddings to approximate the optimal follower's response, bypassing the inner optimization problem by parametrizing it with continuous leader's decisions. The use of Rectified Linear Unit (ReLU) activations…
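
Embedding a trained ReLU network inside an optimization problem, as this paper does, typically relies on the standard mixed-integer (big-M) encoding of y = ReLU(z). The sketch below states the textbook constraints and checks sample assignments; the bounds L and U and binary variable s follow the standard formulation, not necessarily the authors' exact one.

```python
# Given finite bounds L <= z <= U and a binary variable s, the standard
# big-M encoding of y = ReLU(z) uses four linear constraints:
#
#   y >= z
#   y >= 0
#   y <= z - L * (1 - s)
#   y <= U * s
#
# s = 1 forces y = z (active unit); s = 0 forces y = 0 (inactive unit).

def relu_bigm_feasible(z, y, s, L, U, tol=1e-9):
    # Check one (z, y, s) assignment against the four constraints.
    return (y >= z - tol and y >= -tol
            and y <= z - L * (1 - s) + tol
            and y <= U * s + tol)

print(relu_bigm_feasible(z=-0.7, y=0.0, s=0, L=-1.0, U=1.0))  # True
print(relu_bigm_feasible(z=0.3, y=0.3, s=1, L=-1.0, U=1.0))   # True
```
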
2

Kumar, Ajay, and Abhimanyu Singh Panwar. "Human Mental State Detection Using Modified Convolutional Neural Network with Leaky Rectified Linear Unit." In 2024 IEEE Region 10 Symposium (TENSYMP). IEEE, 2024. http://dx.doi.org/10.1109/tensymp61132.2024.10752185.

3

S., Amutha, Umang Soni, Ganesh Pandit Pathak, and Parimala Veluvali. "Enhancing Intrusion Detection Systems with Rectified Linear Unit Function in Pigeon-Inspired Optimization Algorithm." In 2025 3rd International Conference on Data Science and Information System (ICDSIS). IEEE, 2025. https://doi.org/10.1109/icdsis65355.2025.11070926.

4

Zhou, Ruhao, and Ling Li. "Digital Storage and Inheritance of Folk Art using Long Short-Term Memory with Leaky Rectified Linear Unit." In 2024 Second International Conference on Networks, Multimedia and Information Technology (NMITCON). IEEE, 2024. http://dx.doi.org/10.1109/nmitcon62075.2024.10699134.

5

Zhang, Ying. "Detection of Petrochemical Pollutants using Multi Mutation Tunicate Swarm Optimization Algorithm based Convolutional Neural Network with Leaky Rectified Linear Unit." In 2024 International Conference on Distributed Systems, Computer Networks and Cybersecurity (ICDSCNC). IEEE, 2024. https://doi.org/10.1109/icdscnc62492.2024.10939088.

6

Masud, Md Abdullah Al, Alazar Araia, Yuxin Wang, Jianli Hu, and Yuhe Tian. "Machine Learning-Aided Process Design for Microwave-Assisted Ammonia Production." In Foundations of Computer-Aided Process Design. PSE Press, 2024. http://dx.doi.org/10.69997/sct.121422.

Abstract:
Machine learning (ML) has become a powerful tool to analyze complex relationships between multiple variables and to unravel valuable information from big datasets. However, an open research question lies in how ML can accelerate the design and optimization of processes in the early experimental development stages with limited data. In this work, we investigate the ML-aided process design of a microwave reactor for ammonia production with exceedingly little experimental data. We propose an integrated approach of synthetic minority oversampling technique (SMOTE) regression combined with neural networks…
7

Stephanie, Margareta V., Lam Pham, Alexander Schindler, Michael Waltl, Tibor Grasser, and Bernhard Schrenk. "Neural Network with Optical Frequency-Coded ReLU." In Optical Fiber Communication Conference. Optica Publishing Group, 2024. http://dx.doi.org/10.1364/ofc.2024.m4c.2.

Abstract:
We demonstrate a photonic rectified linear unit (ReLU) function accomplished through frequency-coded neural signals. We show operation of an optical neuron with weighted sum and ReLU activation to perform with a 1% penalty in accuracy.
8

Sooksatra, Korn, Greg Hamerly, and Pablo Rivas. "Is ReLU Adversarially Robust?" In LatinX in AI at International Conference on Machine Learning 2023. Journal of LatinX in AI Research, 2023. http://dx.doi.org/10.52591/lxai202307232.

Abstract:
The efficacy of deep learning models has been called into question by the presence of adversarial examples. Addressing the vulnerability of deep learning models to adversarial examples is crucial for ensuring their continued development and deployment. In this work, we focus on the role of rectified linear unit (ReLU) activation functions in the generation of adversarial examples. ReLU functions are commonly used in deep learning models because they facilitate the training process. However, our empirical analysis demonstrates that ReLU functions are not robust against adversarial examples…
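
A standard way to probe the vulnerability this abstract reports is the fast gradient sign method (FGSM) on a ReLU network. The tiny random network and attack budget below are illustrative assumptions, not the authors' experimental setup.

```python
import numpy as np

rng = np.random.default_rng(0)
W1, W2 = rng.normal(size=(4, 8)), rng.normal(size=(8, 1))

def forward(x):
    h = np.maximum(0.0, x @ W1)     # ReLU hidden layer
    return h @ W2                    # scalar logit

def input_gradient(x):
    # Gradient of the logit w.r.t. the input; ReLU contributes a
    # piecewise-constant 0/1 mask.
    h = np.maximum(0.0, x @ W1)
    mask = (h > 0).astype(float)
    return ((W2.T * mask) @ W1.T).ravel()

x = rng.normal(size=4)
eps = 0.1                            # attack budget (illustrative)
x_adv = x + eps * np.sign(input_gradient(x))  # one FGSM step on the logit
print(forward(x), forward(x_adv))    # the perturbation shifts the output
```
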
9

Kessler, Travis, Gregory Dorian, and J. Hunter Mack. "Application of a Rectified Linear Unit (ReLU) Based Artificial Neural Network to Cetane Number Predictions." In ASME 2017 Internal Combustion Engine Division Fall Technical Conference. American Society of Mechanical Engineers, 2017. http://dx.doi.org/10.1115/icef2017-3614.

Abstract:
Due to the high cost and time required to synthesize alternative fuel candidates for comprehensive testing, an Artificial Neural Network (ANN) can be used to predict fuel properties, allowing researchers to preemptively screen desirable fuel candidates. However, the accuracy of an ANN is limited by its error, measured by the root mean square error (RMSE), standard deviation, and r-squared values derived from a given input database. The present work improves upon an existing model for predicting the Cetane Number (CN) by changing the neuron activation function of the ANN from sigmoid to rectified linear unit (ReLU)…
10

Nelson, Daniel, Steven Kiyabu, Timothy Vincent, Andrew Gillman, Amanda Criner, and Philip R. Buskohl. "Spectral Analysis of Mechanical Reservoir Computing With ReLU Spring Networks." In ASME 2024 Conference on Smart Materials, Adaptive Structures and Intelligent Systems. American Society of Mechanical Engineers, 2024. http://dx.doi.org/10.1115/smasis2024-141014.

Abstract:
Nonlinear dynamics are a pervasive phenomenon in natural and synthetic mechanical systems, which can be leveraged for novel control of vibrations and elastic wave propagation. A mechanical system with high dimensionality and nonlinear dynamics can perform information processing on the physical stimuli that act upon the system. This information processing (known as physical reservoir computing) results from the dimensional expansion that occurs in the dynamic system's state as it reacts to the input. The enriched signal contains both nonlinear transformations as well as memory of the…