Academic literature on the topic 'Natural gradient descent'
Below are lists of relevant articles, books, theses, conference reports, and other scholarly sources on the topic 'Natural gradient descent.'
Journal articles on the topic "Natural gradient descent"
Stokes, James, Josh Izaac, Nathan Killoran, and Giuseppe Carleo. "Quantum Natural Gradient." Quantum 4 (May 25, 2020): 269. http://dx.doi.org/10.22331/q-2020-05-25-269.
Rattray, Magnus, David Saad, and Shun-ichi Amari. "Natural Gradient Descent for On-Line Learning." Physical Review Letters 81, no. 24 (December 14, 1998): 5461–64. http://dx.doi.org/10.1103/physrevlett.81.5461.
Heskes, Tom. "On “Natural” Learning and Pruning in Multilayered Perceptrons." Neural Computation 12, no. 4 (April 1, 2000): 881–901. http://dx.doi.org/10.1162/089976600300015637.
Rattray, Magnus, and David Saad. "Analysis of natural gradient descent for multilayer neural networks." Physical Review E 59, no. 4 (April 1, 1999): 4523–32. http://dx.doi.org/10.1103/physreve.59.4523.
Inoue, Masato, Hyeyoung Park, and Masato Okada. "On-Line Learning Theory of Soft Committee Machines with Correlated Hidden Units –Steepest Gradient Descent and Natural Gradient Descent–." Journal of the Physical Society of Japan 72, no. 4 (April 15, 2003): 805–10. http://dx.doi.org/10.1143/jpsj.72.805.
Zhao, Pu, Pin-yu Chen, Siyue Wang, and Xue Lin. "Towards Query-Efficient Black-Box Adversary with Zeroth-Order Natural Gradient Descent." Proceedings of the AAAI Conference on Artificial Intelligence 34, no. 04 (April 3, 2020): 6909–16. http://dx.doi.org/10.1609/aaai.v34i04.6173.
Yang, Howard Hua, and Shun-ichi Amari. "Complexity Issues in Natural Gradient Descent Method for Training Multilayer Perceptrons." Neural Computation 10, no. 8 (November 1, 1998): 2137–57. http://dx.doi.org/10.1162/089976698300017007.
Park, Hyeyoung, and Kwanyong Lee. "Adaptive Natural Gradient Method for Learning of Stochastic Neural Networks in Mini-Batch Mode." Applied Sciences 9, no. 21 (October 28, 2019): 4568. http://dx.doi.org/10.3390/app9214568.
Mukuno, Jun-ichi, and Hajime Matsui. "Natural Gradient Descent of Complex-Valued Neural Networks Invariant under Rotations." IEICE Transactions on Fundamentals of Electronics, Communications and Computer Sciences E102.A, no. 12 (December 1, 2019): 1988–96. http://dx.doi.org/10.1587/transfun.e102.a.1988.
Neumann, K., C. Strub, and J. J. Steil. "Intrinsic plasticity via natural gradient descent with application to drift compensation." Neurocomputing 112 (July 2013): 26–33. http://dx.doi.org/10.1016/j.neucom.2012.12.047.
Dissertations / Theses on the topic "Natural gradient descent"
Inoue, Masato. "On-line learning theory of soft committee machines with correlated hidden units : Steepest gradient descent and natural gradient descent." Kyoto University, 2003. http://hdl.handle.net/2433/148746.
Aguiar, Eliane Martins de. "Aplicação do Word2vec e do Gradiente descendente estocástico em tradução automática." Repositório Institucional do FGV, 2016. http://hdl.handle.net/10438/16798.
Full textApproved for entry into archive by Janete de Oliveira Feitosa (janete.feitosa@fgv.br) on 2016-08-03T20:29:34Z (GMT) No. of bitstreams: 1 dissertacao-ElianeMartins.pdf: 6062037 bytes, checksum: 14567c2feca25a81d6942be3b8bc8a65 (MD5)
Approved for entry into archive by Maria Almeida (maria.socorro@fgv.br) on 2016-08-23T20:12:35Z (GMT) No. of bitstreams: 1 dissertacao-ElianeMartins.pdf: 6062037 bytes, checksum: 14567c2feca25a81d6942be3b8bc8a65 (MD5)
Made available in DSpace on 2016-08-23T20:12:54Z (GMT). No. of bitstreams: 1 dissertacao-ElianeMartins.pdf: 6062037 bytes, checksum: 14567c2feca25a81d6942be3b8bc8a65 (MD5) Previous issue date: 2016-05-30
Word2vec is a neural-network-based system that processes text and represents words as vectors, using a distributed representation. A notable property is the semantic relations found in the resulting models. This work trains two word2vec models, one for Portuguese and one for English, and uses stochastic gradient descent to find a translation matrix between the two vector spaces.
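The abstract above describes learning a linear translation matrix between two embedding spaces with stochastic gradient descent. A minimal sketch of that idea, using random vectors in place of real word2vec embeddings (the dimensions, learning rate, and data here are illustrative, not taken from the thesis):

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-ins for word2vec embeddings: 200 word pairs, 10-dimensional,
# with the "English" vectors Y related to the "Portuguese" vectors X
# by an unknown linear map plus a little noise.
d = 10
true_W = rng.normal(size=(d, d))
X = rng.normal(size=(200, d))
Y = X @ true_W.T + 0.01 * rng.normal(size=(200, d))

# Stochastic gradient descent on the per-pair loss 0.5 * ||W x - y||^2.
W = np.zeros((d, d))
lr = 0.05
for epoch in range(50):
    for i in rng.permutation(len(X)):
        x, y = X[i], Y[i]
        err = W @ x - y              # residual for one word pair
        W -= lr * np.outer(err, x)   # gradient of the loss w.r.t. W

print(np.max(np.abs(W - true_W)))   # small after training
```

After training, translating a word amounts to mapping its source-space vector through `W` and finding the nearest neighbor in the target space.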
Casero, Cañas Ramón. "Left ventricle functional analysis in 2D+t contrast echocardiography within an atlas-based deformable template model framework." Thesis, University of Oxford, 2008. http://ora.ox.ac.uk/objects/uuid:b17b3670-551d-4549-8f10-d977295c1857.
Full text"Adaptive Curvature for Stochastic Optimization." Master's thesis, 2019. http://hdl.handle.net/2286/R.I.53675.
Full textDissertation/Thesis
Masters Thesis Computer Science 2019
Book chapters on the topic "Natural gradient descent"
Yang, H. H., and S. Amari. "Statistical Learning by Natural Gradient Descent." In New Learning Paradigms in Soft Computing, 1–29. Heidelberg: Physica-Verlag HD, 2002. http://dx.doi.org/10.1007/978-3-7908-1803-1_1.
Ibnkahla, Mohamed. "Nonlinear Channel Identification Using Natural Gradient Descent: Application to Modeling and Tracking." In Soft Computing in Communications, 55–70. Berlin, Heidelberg: Springer Berlin Heidelberg, 2004. http://dx.doi.org/10.1007/978-3-540-45090-0_3.
Pawar, A. B., M. A. Jawale, and D. N. Kyatanavar. "Analyzing Fake News Based on Machine Learning Algorithms." In Intelligent Systems and Computer Technology. IOS Press, 2020. http://dx.doi.org/10.3233/apc200146.
Naik, Bighnaraj, Janmenjoy Nayak, and H. S. Behera. "A Hybrid Model of FLANN and Firefly Algorithm for Classification." In Handbook of Research on Natural Computing for Optimization Problems, 491–522. IGI Global, 2016. http://dx.doi.org/10.4018/978-1-5225-0058-2.ch021.
Nayak, Sarat Chandra, Bijan Bihari Misra, and Himansu Sekhar Behera. "Improving Performance of Higher Order Neural Network using Artificial Chemical Reaction Optimization." In Advances in Computational Intelligence and Robotics, 253–80. IGI Global, 2016. http://dx.doi.org/10.4018/978-1-5225-0063-6.ch011.
Narayanan, Swathi Jamjala, Boominathan Perumal, and Jayant G. Rohra. "Swarm-Based Nature-Inspired Metaheuristics for Neural Network Optimization." In Advances in Computational Intelligence and Robotics, 23–53. IGI Global, 2018. http://dx.doi.org/10.4018/978-1-5225-2857-9.ch002.
Mukhopadhyay, Sumitra, and Soumyadip Das. "Application of Nature-Inspired Algorithms for Sensing Error Optimisation in Dynamic Environment." In Nature-Inspired Algorithms for Big Data Frameworks, 124–69. IGI Global, 2019. http://dx.doi.org/10.4018/978-1-5225-5852-1.ch006.
Benes, Peter Mark, Miroslav Erben, Martin Vesely, Ondrej Liska, and Ivo Bukovsky. "HONU and Supervised Learning Algorithms in Adaptive Feedback Control." In Advances in Computational Intelligence and Robotics, 35–60. IGI Global, 2016. http://dx.doi.org/10.4018/978-1-5225-0063-6.ch002.
Conference papers on the topic "Natural gradient descent"
Hong, Yuan, Changhao Xia, Shixiang Zhang, Lin Wu, Chao Yuan, Ying Huang, Xuxu Wang, and Haifeng Zhu. "Load forecasting using elastic gradient descent." In 2013 9th International Conference on Natural Computation (ICNC). IEEE, 2013. http://dx.doi.org/10.1109/icnc.2013.6817979.
Aji, Alham Fikri, and Kenneth Heafield. "Sparse Communication for Distributed Gradient Descent." In Proceedings of the 2017 Conference on Empirical Methods in Natural Language Processing. Stroudsburg, PA, USA: Association for Computational Linguistics, 2017. http://dx.doi.org/10.18653/v1/d17-1045.
Malago, Luigi, Matteo Matteucci, and Giovanni Pistone. "Stochastic Natural Gradient Descent by estimation of empirical covariances." In 2011 IEEE Congress on Evolutionary Computation (CEC). IEEE, 2011. http://dx.doi.org/10.1109/cec.2011.5949720.
Izadi, Mohammad Rasool, Yihao Fang, Robert Stevenson, and Lizhen Lin. "Optimization of Graph Neural Networks with Natural Gradient Descent." In 2020 IEEE International Conference on Big Data (Big Data). IEEE, 2020. http://dx.doi.org/10.1109/bigdata50022.2020.9378063.
Luo, Zhijian, Danping Liao, and Yuntao Qian. "Bound analysis of natural gradient descent in stochastic optimization setting." In 2016 23rd International Conference on Pattern Recognition (ICPR). IEEE, 2016. http://dx.doi.org/10.1109/icpr.2016.7900287.
Bogoychev, Nikolay, Kenneth Heafield, Alham Fikri Aji, and Marcin Junczys-Dowmunt. "Accelerating Asynchronous Stochastic Gradient Descent for Neural Machine Translation." In Proceedings of the 2018 Conference on Empirical Methods in Natural Language Processing. Stroudsburg, PA, USA: Association for Computational Linguistics, 2018. http://dx.doi.org/10.18653/v1/d18-1332.
Cheng, Keyang, Fei Tao, and Jianming Zhang. "A Stochastic Parallel Gradient Descent Algorithem for Pedestrian Re-identification." In 2018 14th International Conference on Natural Computation, Fuzzy Systems and Knowledge Discovery (ICNC-FSKD). IEEE, 2018. http://dx.doi.org/10.1109/fskd.2018.8686843.
Khan, Mohammad Emtiyaz, and Didrik Nielsen. "Fast yet Simple Natural-Gradient Descent for Variational Inference in Complex Models." In 2018 International Symposium on Information Theory and Its Applications (ISITA). IEEE, 2018. http://dx.doi.org/10.23919/isita.2018.8664326.
Ibnkahla, M., and J. Yuan. "A neural network MLSE receiver based on natural gradient descent: application to satellite communications." In Seventh International Symposium on Signal Processing and Its Applications, 2003. Proceedings. IEEE, 2003. http://dx.doi.org/10.1109/isspa.2003.1224633.
Prellberg, Jonas, and Oliver Kramer. "Learned Weight Sharing for Deep Multi-Task Learning by Natural Evolution Strategy and Stochastic Gradient Descent." In 2020 International Joint Conference on Neural Networks (IJCNN). IEEE, 2020. http://dx.doi.org/10.1109/ijcnn48605.2020.9207139.