Journal articles on the topic "Sparse deep neural networks"
Format your source in APA, MLA, Chicago, Harvard, and other citation styles
Consult the top 50 journal articles for your research on the topic "Sparse deep neural networks".
Next to each source in the list of references there is an "Add to bibliography" button. Press on it, and we will generate automatically the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.
You can also download the full text of the scientific publication in .pdf format and read the online abstract whenever these details are available in the metadata.
Browse journal articles on a wide variety of disciplines and organise your bibliography correctly.
Scardapane, Simone, Danilo Comminiello, Amir Hussain, and Aurelio Uncini. "Group sparse regularization for deep neural networks." Neurocomputing 241 (June 2017): 81–89. http://dx.doi.org/10.1016/j.neucom.2017.02.029.
Zang, Ke, Wenqi Wu, and Wei Luo. "Deep Sparse Learning for Automatic Modulation Classification Using Recurrent Neural Networks." Sensors 21, no. 19 (September 25, 2021): 6410. http://dx.doi.org/10.3390/s21196410.
Wu, Kailun, Yiwen Guo, and Changshui Zhang. "Compressing Deep Neural Networks With Sparse Matrix Factorization." IEEE Transactions on Neural Networks and Learning Systems 31, no. 10 (October 2020): 3828–38. http://dx.doi.org/10.1109/tnnls.2019.2946636.
Gangopadhyay, Briti, Pallab Dasgupta, and Soumyajit Dey. "Safety Aware Neural Pruning for Deep Reinforcement Learning (Student Abstract)." Proceedings of the AAAI Conference on Artificial Intelligence 37, no. 13 (June 26, 2023): 16212–13. http://dx.doi.org/10.1609/aaai.v37i13.26966.
Petschenig, Horst, and Robert Legenstein. "Quantized rewiring: hardware-aware training of sparse deep neural networks." Neuromorphic Computing and Engineering 3, no. 2 (May 26, 2023): 024006. http://dx.doi.org/10.1088/2634-4386/accd8f.
Belay, Kaleab. "Gradient and Magnitude Based Pruning for Sparse Deep Neural Networks." Proceedings of the AAAI Conference on Artificial Intelligence 36, no. 11 (June 28, 2022): 13126–27. http://dx.doi.org/10.1609/aaai.v36i11.21699.
Kaur, Mandeep, and Pradip Kumar Yadava. "A Review on Classification of Images with Convolutional Neural Networks." International Journal for Research in Applied Science and Engineering Technology 11, no. 7 (July 31, 2023): 658–63. http://dx.doi.org/10.22214/ijraset.2023.54704.
Bi, Jia, and Steve R. Gunn. "Sparse Deep Neural Network Optimization for Embedded Intelligence." International Journal on Artificial Intelligence Tools 29, no. 03n04 (June 2020): 2060002. http://dx.doi.org/10.1142/s0218213020600027.
Gallicchio, Claudio, and Alessio Micheli. "Fast and Deep Graph Neural Networks." Proceedings of the AAAI Conference on Artificial Intelligence 34, no. 04 (April 3, 2020): 3898–905. http://dx.doi.org/10.1609/aaai.v34i04.5803.
Tartaglione, Enzo, Andrea Bragagnolo, Attilio Fiandrotti, and Marco Grangetto. "LOss-Based SensiTivity rEgulaRization: Towards deep sparse neural networks." Neural Networks 146 (February 2022): 230–37. http://dx.doi.org/10.1016/j.neunet.2021.11.029.
Ma, Rongrong, Jianyu Miao, Lingfeng Niu, and Peng Zhang. "Transformed ℓ1 regularization for learning sparse deep neural networks." Neural Networks 119 (November 2019): 286–98. http://dx.doi.org/10.1016/j.neunet.2019.08.015.
Zhao, Jin, and Licheng Jiao. "Fast Sparse Deep Neural Networks: Theory and Performance Analysis." IEEE Access 7 (2019): 74040–55. http://dx.doi.org/10.1109/access.2019.2920688.
Karim, Ahmad M., Mehmet S. Güzel, Mehmet R. Tolun, Hilal Kaya, and Fatih V. Çelebi. "A New Generalized Deep Learning Framework Combining Sparse Autoencoder and Taguchi Method for Novel Data Classification and Processing." Mathematical Problems in Engineering 2018 (June 7, 2018): 1–13. http://dx.doi.org/10.1155/2018/3145947.
Li, Yihang. "Sparse-Aware Deep Learning Accelerator." Highlights in Science, Engineering and Technology 39 (April 1, 2023): 305–10. http://dx.doi.org/10.54097/hset.v39i.6544.
Ohn, Ilsang, and Yongdai Kim. "Nonconvex Sparse Regularization for Deep Neural Networks and Its Optimality." Neural Computation 34, no. 2 (January 14, 2022): 476–517. http://dx.doi.org/10.1162/neco_a_01457.
Avgerinos, Christos, Nicholas Vretos, and Petros Daras. "Less Is More: Adaptive Trainable Gradient Dropout for Deep Neural Networks." Sensors 23, no. 3 (January 24, 2023): 1325. http://dx.doi.org/10.3390/s23031325.
Hao, Yutong, Yunpeng Liu, Jinmiao Zhao, and Chuang Yu. "Dual-Domain Prior-Driven Deep Network for Infrared Small-Target Detection." Remote Sensing 15, no. 15 (July 31, 2023): 3827. http://dx.doi.org/10.3390/rs15153827.
Lee, Sangkyun, and Jeonghyun Lee. "Compressed Learning of Deep Neural Networks for OpenCL-Capable Embedded Systems." Applied Sciences 9, no. 8 (April 23, 2019): 1669. http://dx.doi.org/10.3390/app9081669.
Mousavi, Hamid, Mohammad Loni, Mina Alibeigi, and Masoud Daneshtalab. "DASS: Differentiable Architecture Search for Sparse Neural Networks." ACM Transactions on Embedded Computing Systems 22, no. 5s (September 9, 2023): 1–21. http://dx.doi.org/10.1145/3609385.
Ao, Ren, Zhang Tao, Wang Yuhao, Lin Sheng, Dong Peiyan, Chen Yen-kuang, Xie Yuan, and Wang Yanzhi. "DARB: A Density-Adaptive Regular-Block Pruning for Deep Neural Networks." Proceedings of the AAAI Conference on Artificial Intelligence 34, no. 04 (April 3, 2020): 5495–502. http://dx.doi.org/10.1609/aaai.v34i04.6000.
Östling, Robert. "Part of Speech Tagging: Shallow or Deep Learning?" Northern European Journal of Language Technology 5 (June 19, 2018): 1–15. http://dx.doi.org/10.3384/nejlt.2000-1533.1851.
Gong, Maoguo, Jia Liu, Hao Li, Qing Cai, and Linzhi Su. "A Multiobjective Sparse Feature Learning Model for Deep Neural Networks." IEEE Transactions on Neural Networks and Learning Systems 26, no. 12 (December 2015): 3263–77. http://dx.doi.org/10.1109/tnnls.2015.2469673.
Boo, Yoonho, and Wonyong Sung. "Compression of Deep Neural Networks with Structured Sparse Ternary Coding." Journal of Signal Processing Systems 91, no. 9 (November 6, 2018): 1009–19. http://dx.doi.org/10.1007/s11265-018-1418-z.
Zhao, Yao, Qingsong Liu, He Tian, Bingo Wing-Kuen Ling, and Zhe Zhang. "DeepRED Based Sparse SAR Imaging." Remote Sensing 16, no. 2 (January 5, 2024): 212. http://dx.doi.org/10.3390/rs16020212.
Wan, Xinyue, Bofeng Zhang, Guobing Zou, and Furong Chang. "Sparse Data Recommendation by Fusing Continuous Imputation Denoising Autoencoder and Neural Matrix Factorization." Applied Sciences 9, no. 1 (December 24, 2018): 54. http://dx.doi.org/10.3390/app9010054.
El-Yabroudi, Mohammad Z., Ikhlas Abdel-Qader, Bradley J. Bazuin, Osama Abudayyeh, and Rakan C. Chabaan. "Guided Depth Completion with Instance Segmentation Fusion in Autonomous Driving Applications." Sensors 22, no. 24 (December 7, 2022): 9578. http://dx.doi.org/10.3390/s22249578.
Qiao, Chen, Yan Shi, Yu-Xian Diao, Vince D. Calhoun, and Yu-Ping Wang. "Log-sum enhanced sparse deep neural network." Neurocomputing 407 (September 2020): 206–20. http://dx.doi.org/10.1016/j.neucom.2020.04.118.
Morotti, Elena, Davide Evangelista, and Elena Loli Piccolomini. "A Green Prospective for Learned Post-Processing in Sparse-View Tomographic Reconstruction." Journal of Imaging 7, no. 8 (August 7, 2021): 139. http://dx.doi.org/10.3390/jimaging7080139.
Wan, Lulu, Tao Chen, Antonio Plaza, and Haojie Cai. "Hyperspectral Unmixing Based on Spectral and Sparse Deep Convolutional Neural Networks." IEEE Journal of Selected Topics in Applied Earth Observations and Remote Sensing 14 (2021): 11669–82. http://dx.doi.org/10.1109/jstars.2021.3126755.
Khattak, Muhammad Irfan, Nasir Saleem, Jiechao Gao, Elena Verdu, and Javier Parra Fuente. "Regularized sparse features for noisy speech enhancement using deep neural networks." Computers and Electrical Engineering 100 (May 2022): 107887. http://dx.doi.org/10.1016/j.compeleceng.2022.107887.
Xie, Zhihua, Yi Li, Jieyi Niu, Ling Shi, Zhipeng Wang, and Guoyu Lu. "Hyperspectral face recognition based on sparse spectral attention deep neural networks." Optics Express 28, no. 24 (November 16, 2020): 36286. http://dx.doi.org/10.1364/oe.404793.
Liu, Wei, Yue Yang, and Longsheng Wei. "Weather Recognition of Street Scene Based on Sparse Deep Neural Networks." Journal of Advanced Computational Intelligence and Intelligent Informatics 21, no. 3 (May 19, 2017): 403–8. http://dx.doi.org/10.20965/jaciii.2017.p0403.
Schwab, Johannes, Stephan Antholzer, and Markus Haltmeier. "Big in Japan: Regularizing Networks for Solving Inverse Problems." Journal of Mathematical Imaging and Vision 62, no. 3 (October 3, 2019): 445–55. http://dx.doi.org/10.1007/s10851-019-00911-1.
Vani, and Piyush Kumar Pareek. "Deep Multiple Instance Learning Approach for Classification in Clinical Decision Support Systems." American Journal of Business and Operations Research 10, no. 2 (2023): 52–60. http://dx.doi.org/10.54216/ajbor.100206.
He, Haoyuan, Lingxuan Huang, Zisen Huang, and Tiantian Yang. "The Compression Techniques Applied on Deep Learning Model." Highlights in Science, Engineering and Technology 4 (July 26, 2022): 325–31. http://dx.doi.org/10.54097/hset.v4i.920.
Almulla Khalaf, Maysa Ibrahem, and John Q. Gan. "A three-stage learning algorithm for deep multilayer perceptron with effective weight initialisation based on sparse auto-encoder." Artificial Intelligence Research 8, no. 1 (April 2, 2019): 41. http://dx.doi.org/10.5430/air.v8n1p41.
Zahn, Olivia, Jorge Bustamante, Callin Switzer, Thomas L. Daniel, and J. Nathan Kutz. "Pruning deep neural networks generates a sparse, bio-inspired nonlinear controller for insect flight." PLOS Computational Biology 18, no. 9 (September 27, 2022): e1010512. http://dx.doi.org/10.1371/journal.pcbi.1010512.
Liu, Xiao, Wenbin Li, Jing Huo, Lili Yao, and Yang Gao. "Layerwise Sparse Coding for Pruned Deep Neural Networks with Extreme Compression Ratio." Proceedings of the AAAI Conference on Artificial Intelligence 34, no. 04 (April 3, 2020): 4900–4907. http://dx.doi.org/10.1609/aaai.v34i04.5927.
Yao, Zhongtian, Kejie Huang, Haibin Shen, and Zhaoyan Ming. "Deep Neural Network Acceleration With Sparse Prediction Layers." IEEE Access 8 (2020): 6839–48. http://dx.doi.org/10.1109/access.2020.2963941.
Lee, Gwo-Chuan, Jyun-Hong Li, and Zi-Yang Li. "A Wasserstein Generative Adversarial Network–Gradient Penalty-Based Model with Imbalanced Data Enhancement for Network Intrusion Detection." Applied Sciences 13, no. 14 (July 12, 2023): 8132. http://dx.doi.org/10.3390/app13148132.
Phan, Huy, Miao Yin, Yang Sui, Bo Yuan, and Saman Zonouz. "CSTAR: Towards Compact and Structured Deep Neural Networks with Adversarial Robustness." Proceedings of the AAAI Conference on Artificial Intelligence 37, no. 2 (June 26, 2023): 2065–73. http://dx.doi.org/10.1609/aaai.v37i2.25299.
Zhang, Hongwei, Jiacheng Ni, Kaiming Li, Ying Luo, and Qun Zhang. "Nonsparse SAR Scene Imaging Network Based on Sparse Representation and Approximate Observations." Remote Sensing 15, no. 17 (August 22, 2023): 4126. http://dx.doi.org/10.3390/rs15174126.
Gong, Zhenghui, Xiaolong Su, Panhe Hu, Shuowei Liu, and Zhen Liu. "Deep Unfolding Sparse Bayesian Learning Network for Off-Grid DOA Estimation with Nested Array." Remote Sensing 15, no. 22 (November 10, 2023): 5320. http://dx.doi.org/10.3390/rs15225320.
Chen, Yuanyuan, and Zhang Yi. "Adaptive sparse dropout: Learning the certainty and uncertainty in deep neural networks." Neurocomputing 450 (August 2021): 354–61. http://dx.doi.org/10.1016/j.neucom.2021.04.047.
Chen, Jiayu, Xiang Li, Vince D. Calhoun, Jessica A. Turner, Theo G. M. van Erp, Lei Wang, Ole A. Andreassen, et al. "Sparse deep neural networks on imaging genetics for schizophrenia case–control classification." Human Brain Mapping 42, no. 8 (March 16, 2021): 2556–68. http://dx.doi.org/10.1002/hbm.25387.
Kovacs, Mate, and Victor V. Kryssanov. "Expanding the Feature Space of Deep Neural Networks for Sentiment Classification." International Journal of Machine Learning and Computing 10, no. 2 (February 2020): 271–76. http://dx.doi.org/10.18178/ijmlc.2020.10.2.931.
Lui, Hugo F. S., and William R. Wolf. "Construction of reduced-order models for fluid flows using deep feedforward neural networks." Journal of Fluid Mechanics 872 (June 14, 2019): 963–94. http://dx.doi.org/10.1017/jfm.2019.358.
Chen, Qipeng, Qiaoqiao Xiong, Haisong Huang, Saihong Tang, and Zhenghong Liu. "Research on the Construction of an Efficient and Lightweight Online Detection Method for Tiny Surface Defects through Model Compression and Knowledge Distillation." Electronics 13, no. 2 (January 5, 2024): 253. http://dx.doi.org/10.3390/electronics13020253.
Zhao, Yao, Chengwen Ou, He Tian, Bingo Wing-Kuen Ling, Ye Tian, and Zhe Zhang. "Sparse SAR Imaging Algorithm in Marine Environments Based on Memory-Augmented Deep Unfolding Network." Remote Sensing 16, no. 7 (April 5, 2024): 1289. http://dx.doi.org/10.3390/rs16071289.
Kohjima, Masahiro. "Shuffled Deep Regression." Proceedings of the AAAI Conference on Artificial Intelligence 38, no. 12 (March 24, 2024): 13238–45. http://dx.doi.org/10.1609/aaai.v38i12.29224.