Selected scientific literature on the topic "Generalization bound"

Create an accurate reference in APA, MLA, Chicago, Harvard, and other styles

Select a source type:

Consult the list of current articles, books, theses, conference proceedings, and other scientific sources relevant to the topic "Generalization bound".

Next to each source in the reference list there is an "Add to bibliography" button. Click it, and we will automatically generate the bibliographic citation of the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the scientific publication in .pdf format and read the abstract of the work online, if it is present in the metadata.

Journal articles on the topic "Generalization bound"

1

Cohn, David, and Gerald Tesauro. "How Tight Are the Vapnik-Chervonenkis Bounds?" Neural Computation 4, no. 2 (1992): 249–69. http://dx.doi.org/10.1162/neco.1992.4.2.249.

Full text of the source
Abstract:
We describe a series of numerical experiments that measure the average generalization capability of neural networks trained on a variety of simple functions. These experiments are designed to test the relationship between average generalization performance and the worst-case bounds obtained from formal learning theory using the Vapnik-Chervonenkis (VC) dimension (Blumer et al. 1989; Haussler et al. 1990). Recent statistical learning theories (Tishby et al. 1989; Schwartz et al. 1990) suggest that surpassing these bounds might be possible if the spectrum of possible generalizations has a “gap”
2

Pereira, Rajesh, and Mohammad Ali Vali. "Generalizations of the Cauchy and Fujiwara Bounds for Products of Zeros of a Polynomial." Electronic Journal of Linear Algebra 31 (February 5, 2016): 565–71. http://dx.doi.org/10.13001/1081-3810.3333.

Full text of the source
Abstract:
The Cauchy bound is one of the best known upper bounds for the modulus of the zeros of a polynomial. The Fujiwara bound is another useful upper bound for the modulus of the zeros of a polynomial. In this paper, compound matrices are used to derive a generalization of both the Cauchy bound and the Fujiwara bound. This generalization yields upper bounds for the modulus of the product of $m$ zeros of the polynomial.
3

Nedovic, M. "Norm bounds for the inverse for generalized Nekrasov matrices in point-wise and block case." Filomat 35, no. 8 (2021): 2705–14. http://dx.doi.org/10.2298/fil2108705n.

Full text of the source
Abstract:
Lower-semi-Nekrasov matrices represent a generalization of Nekrasov matrices. For the inverse of lower-semi-Nekrasov matrices, a max-norm bound is proposed. Numerical examples are given to illustrate that new norm bound can give tighter results compared to already known bounds when applied to Nekrasov matrices. Also, we presented new max-norm bounds for the inverse of lower-semi-Nekrasov matrices in the block case. We considered two types of block generalizations and illustrated the results with numerical examples.
4

Liu, Tongliang, Dacheng Tao, and Dong Xu. "Dimensionality-Dependent Generalization Bounds for k-Dimensional Coding Schemes." Neural Computation 28, no. 10 (2016): 2213–49. http://dx.doi.org/10.1162/neco_a_00872.

Full text of the source
Abstract:
The k-dimensional coding schemes refer to a collection of methods that attempt to represent data using a set of representative k-dimensional vectors and include nonnegative matrix factorization, dictionary learning, sparse coding, k-means clustering, and vector quantization as special cases. Previous generalization bounds for the reconstruction error of the k-dimensional coding schemes are mainly dimensionality-independent. A major advantage of these bounds is that they can be used to analyze the generalization error when data are mapped into an infinite- or high-dimensional feature space. How
5

Rubab, Faiza, Hira Nabi, and Asif R. Khan. "GENERALIZATION AND REFINEMENTS OF JENSEN INEQUALITY." Journal of Mathematical Analysis 12, no. 5 (2021): 1–27. http://dx.doi.org/10.54379/jma-2021-5-1.

Full text of the source
Abstract:
We give generalizations and refinements of Jensen and Jensen–Mercer inequalities by using weights which satisfy the conditions of Jensen and Jensen–Steffensen inequalities. We also give some refinements for discrete and integral versions of the generalized Jensen–Mercer inequality, shown to be an improvement of the upper bound for Jensen's difference given in [32]. Applications of our work include new bounds for some important inequalities used in information theory, and generalizing the relations among means.
6

Nedovic, M., and Lj Cvetkovic. "Norm bounds for the inverse and error bounds for linear complementarity problems for {P1,P2}-Nekrasov matrices." Filomat 35, no. 1 (2021): 239–50. http://dx.doi.org/10.2298/fil2101239n.

Full text of the source
Abstract:
{P1,P2}-Nekrasov matrices represent a generalization of Nekrasov matrices via permutations. In this paper, we obtained an error bound for linear complementarity problems for {P1,P2}-Nekrasov matrices. Numerical examples are given to illustrate that the new error bound can give tighter results compared to already known bounds when applied to Nekrasov matrices. Also, we presented new max-norm bounds for the inverse of {P1,P2}-Nekrasov matrices in the block case, considering two different types of block generalizations. Numerical examples show that new norm bounds for the block case can give tighter
7

赵, 帆. "A Generalization of Hoffman’s Bound." Advances in Applied Mathematics 14, no. 03 (2025): 123–31. https://doi.org/10.12677/aam.2025.143098.

Full text of the source
8

Han, Xinyu, Yi Zhao, and Michael Small. "A tighter generalization bound for reservoir computing." Chaos: An Interdisciplinary Journal of Nonlinear Science 32, no. 4 (2022): 043115. http://dx.doi.org/10.1063/5.0082258.

Full text of the source
Abstract:
While reservoir computing (RC) has demonstrated astonishing performance in many practical scenarios, the understanding of its capability for generalization on previously unseen data is limited. To address this issue, we propose a novel generalization bound for RC based on the empirical Rademacher complexity under the probably approximately correct learning framework. Note that the generalization bound for the RC is derived in terms of the model hyperparameters. For this reason, it can explore the dependencies of the generalization bound for RC on its hyperparameters. Compared with the existing
9

Gassner, Niklas, Marcus Greferath, Joachim Rosenthal, and Violetta Weger. "Bounds for Coding Theory over Rings." Entropy 24, no. 10 (2022): 1473. http://dx.doi.org/10.3390/e24101473.

Full text of the source
Abstract:
Coding theory where the alphabet is identified with the elements of a ring or a module has become an important research topic over the last 30 years. It has been well established that, with the generalization of the algebraic structure to rings, there is a need to also generalize the underlying metric beyond the usual Hamming weight used in traditional coding theory over finite fields. This paper introduces a generalization of the weight introduced by Shi, Wu and Krotov, called overweight. Additionally, this weight can be seen as a generalization of the Lee weight on the integers modulo 4 and
10

Chen, Jun, Hong Chen, Bin Gu, Guodong Liu, Yingjie Wang, and Weifu Li. "Error Analysis Affected by Heavy-Tailed Gradients for Non-Convex Pairwise Stochastic Gradient Descent." Proceedings of the AAAI Conference on Artificial Intelligence 39, no. 15 (2025): 15803–11. https://doi.org/10.1609/aaai.v39i15.33735.

Full text of the source
Abstract:
In recent years, there have been a growing number of works studying the generalization properties of stochastic gradient descent (SGD) from the perspective of algorithmic stability. However, few of them devote to simultaneously studying the generalization and optimization for the non-convex setting, especially pairwise SGD with heavy-tailed gradient noise. This paper considers the impact of the heavy-tailed gradient noise obeying sub-Weibull distribution on the stability-based learning guarantees for non-convex pairwise SGD by investigating its generalization and optimization jointly. Specific
More sources

Theses / dissertations on the topic "Generalization bound"

1

McDonald, Daniel J. "Generalization Error Bounds for Time Series." Research Showcase @ CMU, 2012. http://repository.cmu.edu/dissertations/184.

Full text of the source
Abstract:
In this thesis, I derive generalization error bounds — bounds on the expected inaccuracy of the predictions — for time series forecasting models. These bounds allow forecasters to select among competing models, and to declare that, with high probability, their chosen model will perform well — without making strong assumptions about the data generating process or appealing to asymptotic theory. Expanding upon results from statistical learning theory, I demonstrate how these techniques can help time series forecasters to choose models which behave well under uncertainty. I also show how to estim
2

Kroon, Rodney Stephen. "Support vector machines, generalization bounds, and transduction." Thesis, Stellenbosch : University of Stellenbosch, 2003. http://hdl.handle.net/10019.1/16375.

Full text of the source
3

Kelby, Robin J. "Formalized Generalization Bounds for Perceptron-Like Algorithms." Ohio University / OhioLINK, 2020. http://rave.ohiolink.edu/etdc/view?acc_num=ohiou1594805966855804.

Full text of the source
4

Giulini, Ilaria. "Generalization bounds for random samples in Hilbert spaces." Thesis, Paris, Ecole normale supérieure, 2015. http://www.theses.fr/2015ENSU0026/document.

Full text of the source
Abstract:
This thesis is concerned with obtaining generalization bounds for statistical samples taking values in Hilbert spaces defined by reproducing kernels. The approach consists in obtaining dimension-independent, non-asymptotic bounds in finite-dimensional spaces, using PAC-Bayesian inequalities tied to a Gaussian perturbation of the parameter, and then extending them to separable Hilbert spaces. We first consider the question of estimating the Gram operator from an i.i.d. sample with a robust
5

Wade, Modou. "Apprentissage profond pour les processus faiblement dépendants." Electronic Thesis or Diss., CY Cergy Paris Université, 2024. http://www.theses.fr/2024CYUN1299.

Full text of the source
Abstract:
This thesis deals with deep learning for weakly dependent processes. We consider a class of deep neural network estimators with sparsity regularization and/or penalty regularization. Chapter 1 is a synthesis of the work: it presents the deep learning framework and recalls the main results obtained in Chapters 2, 3, 4, 5, and 6. Chapter 2 considers deep learning for psi-weakly dependent processes. We establish a convergence rate for the risk minimization algorithm
6

Rakhlin, Alexander. "Applications of empirical processes in learning theory : algorithmic stability and generalization bounds." Thesis, Massachusetts Institute of Technology, 2006. http://hdl.handle.net/1721.1/34564.

Full text of the source
Abstract:
Thesis (Ph. D.)--Massachusetts Institute of Technology, Dept. of Brain and Cognitive Sciences, 2006. Includes bibliographical references (p. 141-148). This thesis studies two key properties of learning algorithms: their generalization ability and their stability with respect to perturbations. To analyze these properties, we focus on concentration inequalities and tools from empirical process theory. We obtain theoretical results and demonstrate their applications to machine learning. First, we show how various notions of stability upper- and lower-bound the bias and variance of several e
7

Bellet, Aurélien. "Supervised metric learning with generalization guarantees." Phd thesis, Université Jean Monnet - Saint-Etienne, 2012. http://tel.archives-ouvertes.fr/tel-00770627.

Full text of the source
Abstract:
In recent years, the crucial importance of metrics in machine learning algorithms has led to an increasing interest in optimizing distance and similarity functions using knowledge from training data to make them suitable for the problem at hand. This area of research is known as metric learning. Existing methods typically aim at optimizing the parameters of a given metric with respect to some local constraints over the training sample. The learned metrics are generally used in nearest-neighbor and clustering algorithms. When data consist of feature vectors, a large body of work has focused on lear
8

Nordenfors, Oskar. "A Literature Study Concerning Generalization Error Bounds for Neural Networks via Rademacher Complexity." Thesis, Umeå universitet, Institutionen för matematik och matematisk statistik, 2021. http://urn.kb.se/resolve?urn=urn:nbn:se:umu:diva-184487.

Full text of the source
Abstract:
In this essay some fundamental results from the theory of machine learning and neural networks are presented, with the goal of finally discussing bounds on the generalization error of neural networks, via Rademacher complexity.
9

Katsikarelis, Ioannis. "Structurally Parameterized Tight Bounds and Approximation for Generalizations of Independence and Domination." Thesis, Paris Sciences et Lettres (ComUE), 2019. http://www.theses.fr/2019PSLED048.

Full text of the source
Abstract:
We focus on the (k, r)-CENTER and d-SCATTERED SET problems, which generalize the concepts of vertex domination and independence to larger distances. In the first part, we examine the standard parameterization as well as graph parameters measuring the structure of the input. We present results showing that no algorithm exists with running time below certain bounds if the Exponential Time Hypothesis holds; we produce algorithms of essentially optimal complexity that match these bounds, and
10

Musayeva, Khadija. "Generalization Performance of Margin Multi-category Classifiers." Thesis, Université de Lorraine, 2019. http://www.theses.fr/2019LORR0096/document.

Full text of the source
Abstract:
This thesis deals with the theory of margin multi-category classification, within the framework of the statistical learning theory of Vapnik and Chervonenkis. The objective is to establish generalization bounds with an explicit dependence on the number C of categories, the sample size m, and the margin parameter gamma, when the loss function under consideration is a margin loss satisfying the Lipschitz property. The generalization bound rests on the empirical performance of the classifier as well as on its "capacity". In this thesis, the measures
More sources

Books on the topic "Generalization bound"

1

Arnolʹd, V. I. Experimental Mathematics. MSRI Mathematical Sciences Research Institute, 2015.

Find the full text of the source
2

Espiritu, Yen Le. Race and U.S. Panethnic Formation. Edited by Ronald H. Bayor. Oxford University Press, 2014. http://dx.doi.org/10.1093/oxfordhb/9780199766031.013.013.

Full text of the source
Abstract:
Panethnicity refers to the development of bridging organizations and the generalization of solidarity among subgroups that are racialized to be homogeneous by outsiders. This chapter argues that while the formation of a consolidated white identity in the United States is self-motivated and linked to white privilege, panethnicity for people of color is a product of racial categorization and bound up with power relations. As the influx of new immigrants transforms the demographic composition of existing groups such as Asian Americans and Latinos, group members face the challenge of bridging the
3

Horing, Norman J. Morgenstern. Superfluidity and Superconductivity. Oxford University Press, 2018. http://dx.doi.org/10.1093/oso/9780198791942.003.0013.

Full text of the source
Abstract:
Chapter 13 addresses Bose condensation in superfluids (and superconductors), which involves the field operator ψ having a c-number component (⟨ψ(x,t)⟩ ≠ 0), challenging number conservation. The nonlinear Gross-Pitaevskii equation is derived for this condensate wave function ⟨ψ⟩ = ψ − ψ̃, facilitating identification of the coherence length and the core region of vortex motion. The noncondensate Green's function G̃1(1,1′) = −i⟨(ψ̃(1)ψ̃+(1′))+⟩ and the nonvanishing anomalous correlation function F̃∗(2,1′) = −i⟨(ψ̃+(2)ψ̃+(1′))+⟩ describe the dynamics and elementary excitations of the
4

Hecht, Richard D., and Vincent F. Biondo, eds. Religion and Everyday Life and Culture. ABC-CLIO, LLC, 2010. http://dx.doi.org/10.5040/9798216006909.

Full text of the source
Abstract:
This intriguing three-volume set explores the ways in which religion is bound to the practice of daily life and how daily life is bound to religion. InReligion and Everyday Life and Culture, 36 international scholars describe the impact of religious practices around the world, using rich examples drawn from personal observation. Instead of repeating generalizations about what religionshouldmean, these volumes examine how religions actually influence our public and private lives "on the ground," on a day-to-day basis. Volume one introduces regional histories of the world's religions and discuss
5

Mandelkern, Matthew. Bounded Meaning. Oxford University PressOxford, 2024. http://dx.doi.org/10.1093/oso/9780192870049.001.0001.

Full text of the source
Abstract:
Abstract ‘Bounded Meaning’ is a monograph on the dynamics of interpretation: how and why the interpretation of the building blocks of human language is sensitive, not just to the context in which the expression is used, but also to the expression’s linguistic environment---in other words, how and why interpretation depends not just on global information, but also on local information. The book motivates a range of generalizations about the dynamics of interpretation, some known and some novel, involving modals, conditionals, and anaphora. It then provides an overview of the best extant theory

Book chapters on the topic "Generalization bound"

1

Burnaev, Evgeny. "Generalization Bound for Imbalanced Classification." In Recent Developments in Stochastic Methods and Applications. Springer International Publishing, 2021. http://dx.doi.org/10.1007/978-3-030-83266-7_8.

Full text of the source
2

Baader, Franz. "Unification, weak unification, upper bound, lower bound, and generalization problems." In Rewriting Techniques and Applications. Springer Berlin Heidelberg, 1991. http://dx.doi.org/10.1007/3-540-53904-2_88.

Full text of the source
3

Fukuchi, Kazuto, and Jun Sakuma. "Neutralized Empirical Risk Minimization with Generalization Neutrality Bound." In Machine Learning and Knowledge Discovery in Databases. Springer Berlin Heidelberg, 2014. http://dx.doi.org/10.1007/978-3-662-44848-9_27.

Full text of the source
4

Tang, Li, Zheng Zhao, Xiujun Gong, and Huapeng Zeng. "On the Generalization of PAC-Bayes Bound for SVM Linear Classifier." In Communications in Computer and Information Science. Springer Berlin Heidelberg, 2012. http://dx.doi.org/10.1007/978-3-642-34447-3_16.

Full text of the source
5

Wang, Mingda, Canqian Yang, and Yi Xu. "Posterior Refinement on Metric Matrix Improves Generalization Bound in Metric Learning." In Lecture Notes in Computer Science. Springer Nature Switzerland, 2022. http://dx.doi.org/10.1007/978-3-031-19809-0_12.

Full text of the source
6

Levinson, Norman. "Generalization of Recent Method Giving Lower Bound for N 0(T) of Riemann’s Zeta-Function." In Selected Papers of Norman Levinson. Birkhäuser Boston, 1998. http://dx.doi.org/10.1007/978-1-4612-5335-8_32.

Full text of the source
7

Zhang, Xinhua, Novi Quadrianto, Kristian Kersting, et al. "Generalization Bounds." In Encyclopedia of Machine Learning. Springer US, 2011. http://dx.doi.org/10.1007/978-0-387-30164-8_328.

Full text of the source
8

Reid, Mark. "Generalization Bounds." In Encyclopedia of Machine Learning and Data Mining. Springer US, 2017. http://dx.doi.org/10.1007/978-1-4899-7687-1_328.

Full text of the source
9

Helleseth, Tor, Torleiv Kløve, and Øyvind Ytrehus. "Generalizations of the Griesmer bound." In Lecture Notes in Computer Science. Springer Berlin Heidelberg, 1994. http://dx.doi.org/10.1007/3-540-58265-7_6.

Full text of the source
10

Rejchel, W. "Generalization Bounds for Ranking Algorithms." In Ensemble Classification Methods with Applicationsin R. John Wiley & Sons, Ltd, 2018. http://dx.doi.org/10.1002/9781119421566.ch7.

Full text of the source

Conference papers on the topic "Generalization bound"

1

He, Haiyun, Christina Lee Yu, and Ziv Goldfeld. "Hierarchical Generalization Bounds for Deep Neural Networks." In 2024 IEEE International Symposium on Information Theory (ISIT). IEEE, 2024. http://dx.doi.org/10.1109/isit57864.2024.10619279.

Full text of the source
2

Sefidgaran, Milad, and Abdellatif Zaidi. "Data-Dependent Generalization Bounds via Variable-Size Compressibility." In 2024 IEEE International Symposium on Information Theory (ISIT). IEEE, 2024. http://dx.doi.org/10.1109/isit57864.2024.10619654.

Full text of the source
3

Rodríguez-Gálvez, Borja, Omar Rivasplata, Ragnar Thobaben, and Mikael Skoglund. "A Note on Generalization Bounds for Losses with Finite Moments." In 2024 IEEE International Symposium on Information Theory (ISIT). IEEE, 2024. http://dx.doi.org/10.1109/isit57864.2024.10619194.

Full text of the source
4

Zhu, Bowei, Shaojie Li, and Yong Liu. "Towards Sharper Risk Bounds for Minimax Problems." In Thirty-Third International Joint Conference on Artificial Intelligence {IJCAI-24}. International Joint Conferences on Artificial Intelligence Organization, 2024. http://dx.doi.org/10.24963/ijcai.2024/630.

Full text of the source
Abstract:
Minimax problems have achieved success in machine learning such as adversarial training, robust optimization, reinforcement learning. For theoretical analysis, current optimal excess risk bounds, which are composed by generalization error and optimization error, present 1/n-rates in strongly-convex-strongly-concave (SC-SC) settings. Existing studies mainly focus on minimax problems with specific algorithms for optimization error, with only a few studies on generalization performance, which limit better excess risk bounds. In this paper, we study the generalization bounds measured by the gradie
5

Wen, Wen, Han Li, Tieliang Gong, and Hong Chen. "Towards Sharper Generalization Bounds for Adversarial Contrastive Learning." In Thirty-Third International Joint Conference on Artificial Intelligence {IJCAI-24}. International Joint Conferences on Artificial Intelligence Organization, 2024. http://dx.doi.org/10.24963/ijcai.2024/574.

Full text of the source
Abstract:
Recently, the enhancement on the adversarial robustness of machine learning algorithms has gained significant attention across various application domains. Given the widespread label scarcity issue in real-world data, adversarial contrastive learning (ACL) has been proposed to adversarially train robust models using unlabeled data. Despite the empirical success, its generalization behavior remains poorly understood and far from being well-characterized. This paper aims to address this issue from a learning theory perspective. We establish novel high-probability generalization bounds for the ge
6

Luo, Jin, Yongguang Chen, and Xuejun Zhou. "Generalization Bound for Multi-Classification with Push." In 2010 International Conference on Artificial Intelligence and Computational Intelligence (AICI). IEEE, 2010. http://dx.doi.org/10.1109/aici.2010.201.

Full text of the source
7

Chen, Hao, Zhanfeng Mo, Zhouwang Yang, and Xiao Wang. "Theoretical Investigation of Generalization Bound for Residual Networks." In Twenty-Eighth International Joint Conference on Artificial Intelligence {IJCAI-19}. International Joint Conferences on Artificial Intelligence Organization, 2019. http://dx.doi.org/10.24963/ijcai.2019/288.

Full text of the source
Abstract:
This paper presents a framework for norm-based capacity control with respect to an lp,q-norm in weight-normalized Residual Neural Networks (ResNets). We first formulate the representation of each residual block. For the regression problem, we analyze the Rademacher Complexity of the ResNets family. We also establish a tighter generalization upper bound for weight-normalized ResNets in a more general sight. Using the lp,q-norm weight normalization in which 1/p + 1/q >= 1, we discuss the properties of a width-independent capacity control, which only relies on the depth according to a square roo
8

Zhou, Ruida, Chao Tian, and Tie Liu. "Individually Conditional Individual Mutual Information Bound on Generalization Error." In 2021 IEEE International Symposium on Information Theory (ISIT). IEEE, 2021. http://dx.doi.org/10.1109/isit45174.2021.9518016.

Full text of the source
9

Rezazadeh, Arezou, Sharu Theresa Jose, Giuseppe Durisi, and Osvaldo Simeone. "Conditional Mutual Information-Based Generalization Bound for Meta Learning." In 2021 IEEE International Symposium on Information Theory (ISIT). IEEE, 2021. http://dx.doi.org/10.1109/isit45174.2021.9518020.

Full text of the source
10

Uchida, Masato. "Tight lower bound of generalization error in ensemble learning." In 2014 Joint 7th International Conference on Soft Computing and Intelligent Systems (SCIS) and 15th International Symposium on Advanced Intelligent Systems (ISIS). IEEE, 2014. http://dx.doi.org/10.1109/scis-isis.2014.7044723.

Full text of the source

Reports of organizations on the topic "Generalization bound"

1

Dhankhar, Ritu, and Prasanna Kumar. A Remark on a Generalization of the Cauchy’s Bound. "Prof. Marin Drinov" Publishing House of Bulgarian Academy of Sciences, 2020. http://dx.doi.org/10.7546/crabs.2020.10.01.

Full text of the source
2

Zarrieß, Benjamin, and Anni-Yasmin Turhan. Most Specific Generalizations w.r.t. General EL-TBoxes. Technische Universität Dresden, 2013. http://dx.doi.org/10.25368/2022.196.

Full text of the source
Abstract:
In the area of Description Logics the least common subsumer (lcs) and the most specific concept (msc) are inferences that generalize a set of concepts or an individual, respectively, into a single concept. If computed w.r.t. a general EL-TBox, neither the lcs nor the msc need exist. So far, in this setting, no exact conditions for the existence of lcs- or msc-concepts are known. This report provides necessary and sufficient conditions for the existence of these two kinds of concepts. For the lcs of a fixed number of concepts and the msc, we show decidability of the existence in PTime and polynom