Academic literature on the topic 'Complexité de Rademacher'


Consult the lists of relevant articles, books, theses, conference papers, and other scholarly sources on the topic 'Complexité de Rademacher.'


Journal articles on the topic "Complexité de Rademacher"

1

El-Yaniv, R., and D. Pechyony. "Transductive Rademacher Complexity and its Applications." Journal of Artificial Intelligence Research 35 (June 22, 2009): 193–234. http://dx.doi.org/10.1613/jair.2587.

Abstract:
We develop a technique for deriving data-dependent error bounds for transductive learning algorithms based on transductive Rademacher complexity. Our technique is based on a novel general error bound for transduction in terms of transductive Rademacher complexity, together with a novel bounding technique for Rademacher averages for particular algorithms, in terms of their "unlabeled-labeled" representation. This technique is relevant to many advanced graph-based transductive algorithms, and we demonstrate its effectiveness by deriving error bounds for three well-known algorithms. …
2

Oneto, Luca, Sandro Ridella, and Davide Anguita. "Local Rademacher Complexity Machine." Neurocomputing 342 (May 2019): 24–32. http://dx.doi.org/10.1016/j.neucom.2018.10.087.

3

Lei, Yunwen, and Lixin Ding. "Refined Rademacher Chaos Complexity Bounds with Applications to the Multikernel Learning Problem." Neural Computation 26, no. 4 (2014): 739–60. http://dx.doi.org/10.1162/neco_a_00566.

Abstract:
Estimating the Rademacher chaos complexity of order two is important for understanding the performance of multikernel learning (MKL) machines. In this letter, we develop a novel entropy integral for Rademacher chaos complexities. As compared to the previous bounds, our result is much improved in that it introduces an adjustable parameter ε to prohibit the divergence of the involved integral. With the use of the iteration technique in Steinwart and Scovel (2007), we also apply our Rademacher chaos complexity bound to the MKL problems and improve existing learning rates.
4

Ying, Yiming, and Colin Campbell. "Rademacher Chaos Complexities for Learning the Kernel Problem." Neural Computation 22, no. 11 (2010): 2858–86. http://dx.doi.org/10.1162/neco_a_00028.

Abstract:
We develop a novel generalization bound for learning the kernel problem. First, we show that the generalization analysis of the kernel learning problem reduces to investigation of the suprema of the Rademacher chaos process of order 2 over candidate kernels, which we refer to as Rademacher chaos complexity. Next, we show how to estimate the empirical Rademacher chaos complexity by well-established metric entropy integrals and the pseudo-dimension of the set of candidate kernels. Our new methodology mainly depends on the principal theory of U-processes and entropy integrals. …
5

Han, Min, Di Rong Chen, and Zhao Xu Sun. "Rademacher complexity in Neyman-Pearson classification." Acta Mathematica Sinica, English Series 25, no. 5 (2009): 855–68. http://dx.doi.org/10.1007/s10114-008-6210-8.

6

Lu, Yuwei. "Rademacher Complexity in Simplex/l∞ Set." Journal of Physics: Conference Series 1827, no. 1 (2021): 012145. http://dx.doi.org/10.1088/1742-6596/1827/1/012145.

7

Xu, Chang, Tongliang Liu, Dacheng Tao, and Chao Xu. "Local Rademacher Complexity for Multi-Label Learning." IEEE Transactions on Image Processing 25, no. 3 (2016): 1495–507. http://dx.doi.org/10.1109/tip.2016.2524207.

8

Guermeur, Yann. "Rademacher complexity of margin multi-category classifiers." Neural Computing and Applications 32, no. 24 (2018): 17995–18008. http://dx.doi.org/10.1007/s00521-018-3873-7.

9

Lei, Yunwen, Lixin Ding, and Yingzhou Bi. "Local Rademacher complexity bounds based on covering numbers." Neurocomputing 218 (December 2016): 320–30. http://dx.doi.org/10.1016/j.neucom.2016.08.074.

10

Maximov, Yury, Massih-Reza Amini, and Zaid Harchaoui. "Rademacher Complexity Bounds for a Penalized Multi-class Semi-supervised Algorithm." Journal of Artificial Intelligence Research 61 (April 11, 2018): 761–86. http://dx.doi.org/10.1613/jair.5638.

Abstract:
We propose Rademacher complexity bounds for multi-class classifiers trained with a two-step semi-supervised model. In the first step, the algorithm partitions the partially labeled data and then identifies dense clusters containing k predominant classes using the labeled training examples, such that the proportion of their non-predominant classes is below a fixed threshold that stands for clustering consistency. In the second step, a classifier is trained by minimizing a margin empirical loss over the labeled training set and a penalization term measuring the inability of the learner to predict the …
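As background for the articles above: the empirical Rademacher complexity that these papers bound measures how well a function class can correlate with random ±1 signs on the sample, and it can be estimated numerically by Monte Carlo. The sketch below is a generic illustration under our own simplifying assumption (a finite set of predictors represented by a matrix of their outputs on the sample); it is not the method of any cited work.

```python
import numpy as np

def empirical_rademacher(predictions, n_draws=2000, rng=None):
    """Monte Carlo estimate of the empirical Rademacher complexity
    R_S(F) = E_sigma[ sup_{f in F} (1/n) sum_i sigma_i f(x_i) ],
    where `predictions` is a (|F|, n) array whose rows hold f(x_i)
    for each predictor f, and sigma_i are i.i.d. uniform ±1 signs."""
    rng = np.random.default_rng(rng)
    _, n = predictions.shape
    total = 0.0
    for _ in range(n_draws):
        sigma = rng.choice([-1.0, 1.0], size=n)   # Rademacher sign vector
        total += np.max(predictions @ sigma) / n  # sup over the finite class
    return total / n_draws

# Sanity checks: a single fixed predictor cannot chase the random signs,
# so its complexity is ~0; the pair {f, -f} with f ≡ 1 has complexity
# E|mean(sigma)| ≈ sqrt(2 / (pi * n)).
single = empirical_rademacher(np.ones((1, 100)), rng=0)
pair = empirical_rademacher(np.vstack([np.ones(100), -np.ones(100)]), rng=0)
```

Larger or richer classes yield larger estimates, which is exactly the dependence the refined bounds in these papers try to control.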

Dissertations / Theses on the topic "Complexité de Rademacher"

1

El Dakdouki, Aya. "Machine à vecteurs de support hyperbolique et ingénierie du noyau." Thesis, Lille 1, 2019. http://www.theses.fr/2019LIL1I046/document.

Abstract:
Statistical learning theory is a branch of inferential statistics whose foundations were laid by Vapnik at the end of the 1960s. It is considered a subfield of artificial intelligence. In machine learning, support vector machines (SVMs) are a family of supervised learning techniques designed to solve classification and regression problems. In this thesis, our goal is to propose two new statistical learning problems: one concerning the design and evaluation of an extension of SVMs …
2

Nordenfors, Oskar. "A Literature Study Concerning Generalization Error Bounds for Neural Networks via Rademacher Complexity." Thesis, Umeå universitet, Institutionen för matematik och matematisk statistik, 2021. http://urn.kb.se/resolve?urn=urn:nbn:se:umu:diva-184487.

Abstract:
In this essay some fundamental results from the theory of machine learning and neural networks are presented, with the goal of finally discussing bounds on the generalization error of neural networks, via Rademacher complexity.
3

Philips, Petra Camilla. "Data-Dependent Analysis of Learning Algorithms." The Australian National University, Research School of Information Sciences and Engineering, 2005. http://thesis.anu.edu.au./public/adt-ANU20050901.204523.

Abstract:
This thesis studies the generalization ability of machine learning algorithms in a statistical setting. It focuses on the data-dependent analysis of the generalization performance of learning algorithms in order to make full use of the potential of the actual training sample from which these algorithms learn. First, we propose an extension of the standard framework for the derivation of generalization bounds for algorithms taking their hypotheses from random classes of functions. This approach is motivated by the fact that the function produced by a learning algorithm based on a random …
4

Gao, Qingyi. "ADVERSARIAL LEARNING ON ROBUSTNESS AND GENERATIVE MODELS." Thesis, 2021.

Abstract:
In this dissertation, we study two important problems in the area of modern deep learning: adversarial robustness and adversarial generative models. In the first part, we study the generalization performance of deep neural networks (DNNs) in adversarial learning. Recent studies have shown that many machine learning models are vulnerable to adversarial attacks, but much remains unknown concerning the generalization error in this scenario. We focus on the $\ell_\infty$ adversarial attacks produced under the fast gradient sign method (FGSM). We establish a tight bound for the adversarial …

Book chapters on the topic "Complexité de Rademacher"

1

Buhmann, M. D., Prem Melville, Vikas Sindhwani, et al. "Rademacher Complexity." In Encyclopedia of Machine Learning. Springer US, 2011. http://dx.doi.org/10.1007/978-0-387-30164-8_690.

2

Tolstikhin, Ilya, Nikita Zhivotovskiy, and Gilles Blanchard. "Permutational Rademacher Complexity." In Lecture Notes in Computer Science. Springer International Publishing, 2015. http://dx.doi.org/10.1007/978-3-319-24486-0_14.

3

Balle, Borja, and Mehryar Mohri. "On the Rademacher Complexity of Weighted Automata." In Lecture Notes in Computer Science. Springer International Publishing, 2015. http://dx.doi.org/10.1007/978-3-319-24486-0_12.

4

Maurer, Andreas. "The Rademacher Complexity of Linear Transformation Classes." In Learning Theory. Springer Berlin Heidelberg, 2006. http://dx.doi.org/10.1007/11776420_8.

5

V’yugin, Vladimir V. "VC Dimension, Fat-Shattering Dimension, Rademacher Averages, and Their Applications." In Measures of Complexity. Springer International Publishing, 2015. http://dx.doi.org/10.1007/978-3-319-21852-6_6.

6

Wang, Zhe, Wenbo Jie, Daqi Gao, and Jin Xu. "Rademacher Complexity Analysis for Matrixized and Vectorized Classifier." In Recent Advances in Computer Science and Information Engineering. Springer Berlin Heidelberg, 2012. http://dx.doi.org/10.1007/978-3-642-25781-0_107.

7

Hsu, Kelvin, Richard Nock, and Fabio Ramos. "Hyperparameter Learning for Conditional Kernel Mean Embeddings with Rademacher Complexity Bounds." In Machine Learning and Knowledge Discovery in Databases. Springer International Publishing, 2019. http://dx.doi.org/10.1007/978-3-030-10928-8_14.

8

Katrenko, Sophia, and Menno van Zaanen. "Rademacher Complexity and Grammar Induction Algorithms: What It May (Not) Tell Us." In Grammatical Inference: Theoretical Results and Applications. Springer Berlin Heidelberg, 2010. http://dx.doi.org/10.1007/978-3-642-15488-1_29.

9

Oneto, Luca, Davide Anguita, Alessandro Ghio, and Sandro Ridella. "Rademacher Complexity and Structural Risk Minimization: An Application to Human Gene Expression Datasets." In Artificial Neural Networks and Machine Learning – ICANN 2012. Springer Berlin Heidelberg, 2012. http://dx.doi.org/10.1007/978-3-642-33266-1_61.

10

Klęsk, Przemysław. "A Comparison of Complexity Selection Approaches for Polynomials Based on: Vapnik-Chervonenkis Dimension, Rademacher Complexity and Covering Numbers." In Artificial Intelligence and Soft Computing. Springer Berlin Heidelberg, 2012. http://dx.doi.org/10.1007/978-3-642-29350-4_12.


Conference papers on the topic "Complexité de Rademacher"

1

Zhou, Jiajia, Jianwei Liu, and Xionglin Luo. "Rademacher complexity bound for domain adaptation regression." In 2015 Conference on Technologies and Applications of Artificial Intelligence (TAAI). IEEE, 2015. http://dx.doi.org/10.1109/taai.2015.7407123.

2

Oneto, Luca, Alessandro Ghio, Sandro Ridella, and Davide Anguita. "Fast convergence of extended Rademacher Complexity bounds." In 2015 International Joint Conference on Neural Networks (IJCNN). IEEE, 2015. http://dx.doi.org/10.1109/ijcnn.2015.7280414.

3

Guermeur, Yann. "Rademacher complexity of margin multi-category classifiers." In 2017 12th International Workshop on Self-Organizing Maps and Learning Vector Quantization, Clustering and Data Visualization (WSOM). IEEE, 2017. http://dx.doi.org/10.1109/wsom.2017.8020034.

4

Li, Jian, Yong Liu, Rong Yin, and Weiping Wang. "Multi-Class Learning using Unlabeled Samples: Theory and Algorithm." In Twenty-Eighth International Joint Conference on Artificial Intelligence {IJCAI-19}. International Joint Conferences on Artificial Intelligence Organization, 2019. http://dx.doi.org/10.24963/ijcai.2019/399.

Abstract:
In this paper, we investigate the generalization performance of multi-class classification, for which we obtain a sharper error bound by using the notion of local Rademacher complexity and additional unlabeled samples, substantially improving the state-of-the-art bounds in existing multi-class learning methods. These theoretical results motivate us to devise an efficient multi-class learning framework with the local Rademacher complexity and Laplacian regularization. Coinciding with the theoretical analysis, experimental results demonstrate that the proposed approach achieves better performance.
5

Raymond, Christian, Qi Chen, Bing Xue, and Mengjie Zhang. "Genetic Programming with Rademacher Complexity for Symbolic Regression." In 2019 IEEE Congress on Evolutionary Computation (CEC). IEEE, 2019. http://dx.doi.org/10.1109/cec.2019.8790341.

6

Maximov, Yury, Massih-Reza Amini, and Zaid Harchaoui. "Rademacher Complexity Bounds for a Penalized Multi-class Semi-supervised Algorithm (Extended Abstract)." In Twenty-Seventh International Joint Conference on Artificial Intelligence {IJCAI-18}. International Joint Conferences on Artificial Intelligence Organization, 2018. http://dx.doi.org/10.24963/ijcai.2018/800.

Abstract:
We propose Rademacher complexity bounds for multi-class classifiers trained with a two-step semi-supervised model. In the first step, the algorithm partitions the partially labeled data and then identifies dense clusters containing k predominant classes using the labeled training examples, such that the proportion of their non-predominant classes is below a fixed threshold that stands for clustering consistency. In the second step, a classifier is trained by minimizing a margin empirical loss over the labeled training set and a penalization term measuring the inability of the learner to predict the …
7

Liu, Jianwei, Jiajia Zhou, and Xionglin Luo. "Multiple source domain adaptation: A sharper bound using weighted Rademacher complexity." In 2015 Conference on Technologies and Applications of Artificial Intelligence (TAAI). IEEE, 2015. http://dx.doi.org/10.1109/taai.2015.7407124.

8

Anguita, Davide, Alessandro Ghio, Luca Oneto, and Sandro Ridella. "Some results about the Vapnik-Chervonenkis entropy and the Rademacher complexity." In 2013 International Joint Conference on Neural Networks (IJCNN 2013 - Dallas). IEEE, 2013. http://dx.doi.org/10.1109/ijcnn.2013.6706943.

9

Zhou, Mingyong. "A Theory on AI Uncertainty Based on Rademacher Complexity and Shannon Entropy." In 2020 IEEE 3rd International Conference of Safe Production and Informatization (IICSPI). IEEE, 2020. http://dx.doi.org/10.1109/iicspi51290.2020.9332424.

10

Giraldo, L. F., E. Delgado, and C. G. Castellanos. "Feature weighting and selection using a hybrid approach based on Rademacher complexity model selection." In 2007 34th Annual Computers in Cardiology Conference. IEEE, 2007. http://dx.doi.org/10.1109/cic.2007.4745470.

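Many of the bounds studied in the works listed above refine the classical symmetrization result: for losses in [0, 1], with probability at least 1 − δ, the true risk is at most the empirical risk plus 2·R_n(F) plus sqrt(ln(1/δ) / (2n)). The following is a minimal numeric sketch of this standard textbook bound only, not the sharper bounds of any cited paper.

```python
import math

def rademacher_generalization_bound(emp_risk, rad_complexity, n, delta=0.05):
    """Classical bound for [0,1]-valued losses: with probability >= 1 - delta,
    true risk <= emp_risk + 2 * R_n + sqrt(ln(1/delta) / (2n))."""
    return emp_risk + 2.0 * rad_complexity + math.sqrt(
        math.log(1.0 / delta) / (2.0 * n)
    )

# e.g. training error 0.10, estimated R_n = 0.05, n = 1000 samples:
bound = rademacher_generalization_bound(0.10, 0.05, 1000)
```

The slack term shrinks as O(1/sqrt(n)), so for large samples the Rademacher complexity term dominates, which is why the cited papers focus on tightening it (local complexities, chaos complexities, covering-number bounds, etc.).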