Journal articles on the topic 'ReLU'
Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles
Consult the top 50 journal articles for your research on the topic 'ReLU.'
Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.
You can also download the full text of the academic publication as a PDF and read its abstract online whenever it is available in the metadata.
Browse journal articles on a wide variety of disciplines and organise your bibliography correctly.
Korch, Shaleen B., Heidi Contreras, and Josephine E. Clark-Curtiss. "Three Mycobacterium tuberculosis Rel Toxin-Antitoxin Modules Inhibit Mycobacterial Growth and Are Expressed in Infected Human Macrophages." Journal of Bacteriology 191, no. 5 (2008): 1618–30. http://dx.doi.org/10.1128/jb.01318-08.
Trudel, Eric. "Saussure relu." Semiotica 2017, no. 217 (2017): 263–69. http://dx.doi.org/10.1515/sem-2016-0059.
Ma, Zhongkui, Jiaying Li, and Guangdong Bai. "ReLU Hull Approximation." Proceedings of the ACM on Programming Languages 8, POPL (2024): 2260–87. http://dx.doi.org/10.1145/3632917.
Liang, XingLong, and Jun Xu. "Biased ReLU neural networks." Neurocomputing 423 (January 2021): 71–79. http://dx.doi.org/10.1016/j.neucom.2020.09.050.
Sajadi-Rosen, N. "Michon lu et relu." French Studies 66, no. 3 (2012): 427–28. http://dx.doi.org/10.1093/fs/kns132.
Bai, Yuhan. "RELU-Function and Derived Function Review." SHS Web of Conferences 144 (2022): 02006. http://dx.doi.org/10.1051/shsconf/202214402006.
Layton, Oliver W., Siyuan Peng, and Scott T. Steinmetz. "ReLU, Sparseness, and the Encoding of Optic Flow in Neural Networks." Sensors 24, no. 23 (2024): 7453. http://dx.doi.org/10.3390/s24237453.
Hanin, Boris. "Universal Function Approximation by Deep Neural Nets with Bounded Width and ReLU Activations." Mathematics 7, no. 10 (2019): 992. http://dx.doi.org/10.3390/math7100992.
Naufal, Budiman, Adi Kusworo, and Wibowo Adi. "Impact of Activation Function on the Performance of Convolutional Neural Network in Identifying Oil Palm Fruit Ripeness." International Journal of Mathematics and Computer Research 13, no. 04 (2025): 5107–13. https://doi.org/10.5281/zenodo.15261476.
Harvey, David R. "RELU Special Issue: Editorial Reflections." Journal of Agricultural Economics 57, no. 2 (2006): 329–36. http://dx.doi.org/10.1111/j.1477-9552.2006.00055.x.
Dittmer, Soren, Emily J. King, and Peter Maass. "Singular Values for ReLU Layers." IEEE Transactions on Neural Networks and Learning Systems 31, no. 9 (2020): 3594–605. http://dx.doi.org/10.1109/tnnls.2019.2945113.
Kulathunga, Nalinda, Nishath Rajiv Ranasinghe, Daniel Vrinceanu, Zackary Kinsman, Lei Huang, and Yunjiao Wang. "Effects of Nonlinearity and Network Architecture on the Performance of Supervised Neural Networks." Algorithms 14, no. 2 (2021): 51. http://dx.doi.org/10.3390/a14020051.
Huang, Changcun. "ReLU Networks Are Universal Approximators via Piecewise Linear or Constant Functions." Neural Computation 32, no. 11 (2020): 2249–78. http://dx.doi.org/10.1162/neco_a_01316.
Dung, D., V. K. Nguyen, and M. X. Thao. "ON COMPUTATION COMPLEXITY OF HIGH-DIMENSIONAL APPROXIMATION BY DEEP ReLU NEURAL NETWORKS." BULLETIN of L.N. Gumilyov Eurasian National University. MATHEMATICS. COMPUTER SCIENCE. MECHANICS Series 133, no. 4 (2020): 8–18. http://dx.doi.org/10.32523/2616-7182/2020-133-4-8-18.
Katende, Ronald, Henry Kasumba, Godwin Kakuba, and John M. Mango. "A proof of convergence and equivalence for 1D finite element methods and ReLU neural networks." Annals of Mathematics and Computer Science 25 (November 16, 2024): 97–111. http://dx.doi.org/10.56947/amcs.v25.392.
Chieng, Hock Hung, Noorhaniza Wahid, Ong Pauline, and Sai Raj Kishore Perla. "Flatten-T Swish: a thresholded ReLU-Swish-like activation function for deep learning." International Journal of Advances in Intelligent Informatics 4, no. 2 (2018): 76. http://dx.doi.org/10.26555/ijain.v4i2.249.
Butt, F. M., L. Hussain, S. H. M. Jafri, et al. "Optimizing Parameters of Artificial Intelligence Deep Convolutional Neural Networks (CNN) to improve Prediction Performance of Load Forecasting System." IOP Conference Series: Earth and Environmental Science 1026, no. 1 (2022): 012028. http://dx.doi.org/10.1088/1755-1315/1026/1/012028.
Purnawansyah, Purnawansyah, Haviluddin Haviluddin, Herdianti Darwis, Huzain Azis, and Yulita Salim. "Backpropagation Neural Network with Combination of Activation Functions for Inbound Traffic Prediction." Knowledge Engineering and Data Science 4, no. 1 (2021): 14. http://dx.doi.org/10.17977/um018v4i12021p14-28.
Razali, Noor Fadzilah, Iza Sazanita Isa, Siti Noraini Sulaiman, Muhammad Khusairi Osman, Noor Khairiah A. Karim, and Dayang Suhaida Awang Damit. "Genetic algorithm-adapted activation function optimization of deep learning framework for breast mass cancer classification in mammogram images." International Journal of Electrical and Computer Engineering (IJECE) 15, no. 3 (2025): 2820. https://doi.org/10.11591/ijece.v15i3.pp2820-2833.
Manns, F. "Zacharie 12,10 relu en Jean 19,37." Liber Annuus 56 (January 2006): 301–10. http://dx.doi.org/10.1484/j.la.2.303646.
Dũng, Dinh, Van Kien Nguyen, and Mai Xuan Thao. "COMPUTATION COMPLEXITY OF DEEP RELU NEURAL NETWORKS IN HIGH-DIMENSIONAL APPROXIMATION." Journal of Computer Science and Cybernetics 37, no. 3 (2021): 291–320. http://dx.doi.org/10.15625/1813-9663/37/3/15902.
Gühring, Ingo, Gitta Kutyniok, and Philipp Petersen. "Error bounds for approximations with deep ReLU neural networks in Ws,p norms." Analysis and Applications 18, no. 05 (2019): 803–59. http://dx.doi.org/10.1142/s0219530519410021.
Gao, Hongyang, Lei Cai, and Shuiwang Ji. "Adaptive Convolutional ReLUs." Proceedings of the AAAI Conference on Artificial Intelligence 34, no. 04 (2020): 3914–21. http://dx.doi.org/10.1609/aaai.v34i04.5805.
Kondra, Pranitha, and Naresh Vurukonda. "Feature Extraction and Classification of Gray-Scale Images of Brain Tumor using Deep Learning." Scalable Computing: Practice and Experience 25, no. 2 (2024): 1005–17. http://dx.doi.org/10.12694/scpe.v25i2.2456.
Klusowski, Jason M., and Andrew R. Barron. "Approximation by Combinations of ReLU and Squared ReLU Ridge Functions With $\ell^1$ and $\ell^0$ Controls." IEEE Transactions on Information Theory 64, no. 12 (2018): 7649–56. http://dx.doi.org/10.1109/tit.2018.2874447.
Hanoon, Faten Salim, and Abbas Hanon Hassin Alasadi. "A modified residual network for detection and classification of Alzheimer's disease." International Journal of Electrical and Computer Engineering (IJECE) 12, no. 4 (2022): 4400–4407. http://dx.doi.org/10.11591/ijece.v12i4.pp4400-4407.
Yahya, Ali Abdullah, Kui Liu, Ammar Hawbani, Yibin Wang, and Ali Naser Hadi. "A Novel Image Classification Method Based on Residual Network, Inception, and Proposed Activation Function." Sensors 23, no. 6 (2023): 2976. http://dx.doi.org/10.3390/s23062976.
Noprisson, Handrie, Vina Ayumi, Mariana Purba, and Nur Ani. "MOBILENET PERFORMANCE IMPROVEMENTS FOR DEEPFAKE IMAGE IDENTIFICATION USING ACTIVATION FUNCTION AND REGULARIZATION." JITK (Jurnal Ilmu Pengetahuan dan Teknologi Komputer) 10, no. 2 (2024): 441–48. http://dx.doi.org/10.33480/jitk.v10i2.5798.
Salam, Abdulwahed, Abdelaaziz El Hibaoui, and Abdulgabbar Saif. "A comparison of activation functions in multilayer neural network for predicting the production and consumption of electricity power." International Journal of Electrical and Computer Engineering (IJECE) 11, no. 1 (2021): 163–70. http://dx.doi.org/10.11591/ijece.v11i1.pp163-170.
Pattanaik, Abhipsa, and Leena Das. "DeepSkinNet: A Deep Learning Induced Skin Lesion Extraction System from Dermoscopic Images." International Journal of Online and Biomedical Engineering (iJOE) 21, no. 07 (2025): 15–28. https://doi.org/10.3991/ijoe.v21i07.54621.
Akter, Shahrin, and Mohammad Rafiqul Haider. "mTanh: A Low-Cost Inkjet-Printed Vanishing Gradient Tolerant Activation Function." Journal of Low Power Electronics and Applications 15, no. 2 (2025): 27. https://doi.org/10.3390/jlpea15020027.
Opschoor, Joost A. A., Philipp C. Petersen, and Christoph Schwab. "Deep ReLU networks and high-order finite element methods." Analysis and Applications 18, no. 05 (2020): 715–70. http://dx.doi.org/10.1142/s0219530519410136.
Liu, Bo, and Yi Liang. "Optimal function approximation with ReLU neural networks." Neurocomputing 435 (May 2021): 216–27. http://dx.doi.org/10.1016/j.neucom.2021.01.007.
Wang, Pichao, Xue Wang, Hao Luo, et al. "Scaled ReLU Matters for Training Vision Transformers." Proceedings of the AAAI Conference on Artificial Intelligence 36, no. 3 (2022): 2495–503. http://dx.doi.org/10.1609/aaai.v36i3.20150.
Dereich, Steffen, and Sebastian Kassing. "On minimal representations of shallow ReLU networks." Neural Networks 148 (April 2022): 121–28. http://dx.doi.org/10.1016/j.neunet.2022.01.006.
Chen, Zhi, and Pin-Han Ho. "Global-connected network with generalized ReLU activation." Pattern Recognition 96 (December 2019): 106961. http://dx.doi.org/10.1016/j.patcog.2019.07.006.
Barbu, Adrian. "Training a Two-Layer ReLU Network Analytically." Sensors 23, no. 8 (2023): 4072. http://dx.doi.org/10.3390/s23084072.
Mesran, M., Sitti Rachmawati Yahya, Fifto Nugroho, and Agus Perdana Windarto. "Investigating the Impact of ReLU and Sigmoid Activation Functions on Animal Classification Using CNN Models." Jurnal RESTI (Rekayasa Sistem dan Teknologi Informasi) 8, no. 1 (2024): 111–18. http://dx.doi.org/10.29207/resti.v8i1.5367.
Windarto, Agus Perdana, Indra Riyana Rahadjeng, Muhammad Noor Hasan Siregar, and Muhammad Habib Yuhandri. "Optimization of the Activation Function for Predicting Inflation Levels to Increase Accuracy Values." JURNAL MEDIA INFORMATIKA BUDIDARMA 8, no. 3 (2024): 1627. http://dx.doi.org/10.30865/mib.v8i3.7776.
Madhu, Golla, Sandeep Kautish, Khalid Abdulaziz Alnowibet, Hossam M. Zawbaa, and Ali Wagdy Mohamed. "NIPUNA: A Novel Optimizer Activation Function for Deep Neural Networks." Axioms 12, no. 3 (2023): 246. http://dx.doi.org/10.3390/axioms12030246.
Sanjaya, Andi, Endang Setyati, and Herman Budianto. "Model Architecture of CNN for Recognition the Pandava Mask." Inform : Jurnal Ilmiah Bidang Teknologi Informasi dan Komunikasi 5, no. 2 (2020): 99–104. http://dx.doi.org/10.25139/inform.v5i2.2740.
Xu, Xintao, Yi Liu, Gang Chen, Junbin Ye, Zhigang Li, and Huaxiang Lu. "A Cooperative Lightweight Translation Algorithm Combined with Sparse-ReLU." Computational Intelligence and Neuroscience 2022 (May 28, 2022): 1–12. http://dx.doi.org/10.1155/2022/4398839.
Pardede, Doughlas, Ichsan Firmansyah, Meli Handayani, Meisarah Riandini, and Rika Rosnelly. "COMPARISON OF MULTILAYER PERCEPTRON'S ACTIVATION AND OPTIMIZATION FUNCTIONS IN CLASSIFICATION OF COVID-19 PATIENTS." JURTEKSI (Jurnal Teknologi dan Sistem Informasi) 8, no. 3 (2022): 271–78. http://dx.doi.org/10.33330/jurteksi.v8i3.1482.
Lee, Hyeonjeong, Jaewon Lee, and Miyoung Shin. "Using Wearable ECG/PPG Sensors for Driver Drowsiness Detection Based on Distinguishable Pattern of Recurrence Plots." Electronics 8, no. 2 (2019): 192. http://dx.doi.org/10.3390/electronics8020192.
Zheng, Shuxin, Qi Meng, Huishuai Zhang, Wei Chen, Nenghai Yu, and Tie-Yan Liu. "Capacity Control of ReLU Neural Networks by Basis-Path Norm." Proceedings of the AAAI Conference on Artificial Intelligence 33 (July 17, 2019): 5925–32. http://dx.doi.org/10.1609/aaai.v33i01.33015925.
Azhary, Muhammad Zulhazmi Rafiqi, and Amelia Ritahani Ismail. "Comparative Performance of Different Convolutional Neural Network Activation Functions on Image Classification." International Journal on Perceptive and Cognitive Computing 10, no. 2 (2024): 118–22. http://dx.doi.org/10.31436/ijpcc.v10i2.490.
Margolang, Khairul Fadhli, Sugeng Riyadi, Rika Rosnelly, and Wanayumini. "Pengenalan Masker Wajah Menggunakan VGG-16 dan Multilayer Perceptron." Jurnal Telematika 17, no. 2 (2023): 80–87. http://dx.doi.org/10.61769/telematika.v17i2.519.
Daniel, Irwan, Agus Fahmi Limas Ptr, and Aulia Ichsan. "Klasifikasi Risiko Penyakit Jantung Dengan Multilayer Perceptron." Data Sciences Indonesia (DSI) 4, no. 1 (2024): 78–82. https://doi.org/10.47709/dsi.v4i1.4667.