Academic literature on the topic 'Lipschitz neural network'

Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles


Consult the lists of relevant articles, books, theses, conference reports, and other scholarly sources on the topic 'Lipschitz neural network.'

Next to every source in the list of references there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever these are available in the metadata.

Journal articles on the topic "Lipschitz neural network"

1

Zhu, Zelong, Chunna Zhao, and Yaqun Huang. "Fractional order Lipschitz recurrent neural network with attention for long time series prediction." Journal of Physics: Conference Series 2813, no. 1 (2024): 012015. http://dx.doi.org/10.1088/1742-6596/2813/1/012015.

Abstract:
Time series data prediction holds significant importance in various applications. In this study, we specifically concentrate on long time series prediction. Recurrent Neural Networks are widely recognized as a fundamental architecture for effectively processing time-series data, but Recurrent Neural Network models encounter vanishing or exploding gradients on long series data. To resolve the gradient problem and improve accuracy, the Fractional Order Lipschitz Recurrent Neural Network (FOLRNN) model is proposed to predict long time series …
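The gradient-control idea in this abstract can be made concrete with a small sketch. Below is a minimal NumPy illustration, not the paper's fractional-order FOLRNN: the cell, names, and scaling constants are assumptions. The hidden-to-hidden matrix is rescaled to spectral norm at most 1, so the state transition is 1-Lipschitz in the hidden state and perturbations cannot be amplified across time steps.

    import numpy as np

    def spectral_rescale(W, target=1.0):
        # Rescale W so its largest singular value is at most `target`.
        sigma = np.linalg.norm(W, 2)  # largest singular value
        return W if sigma <= target else W * (target / sigma)

    class LipschitzRNNCell:
        # Hypothetical cell for illustration; not the paper's FOLRNN.
        def __init__(self, n_in, n_hidden, seed=0):
            rng = np.random.default_rng(seed)
            self.W_h = spectral_rescale(rng.normal(size=(n_hidden, n_hidden)))
            self.W_x = 0.1 * rng.normal(size=(n_hidden, n_in))

        def step(self, h, x):
            # tanh is 1-Lipschitz and ||W_h||_2 <= 1, so h -> step(h, x)
            # is 1-Lipschitz in h: gradients cannot explode through it.
            return np.tanh(self.W_h @ h + self.W_x @ x)

    cell = LipschitzRNNCell(n_in=3, n_hidden=8)
    h = np.zeros(8)
    for x in np.random.default_rng(1).normal(size=(100, 3)):
        h = cell.step(h, x)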
2

Zhang, Huan, Pengchuan Zhang, and Cho-Jui Hsieh. "RecurJac: An Efficient Recursive Algorithm for Bounding Jacobian Matrix of Neural Networks and Its Applications." Proceedings of the AAAI Conference on Artificial Intelligence 33 (July 17, 2019): 5757–64. http://dx.doi.org/10.1609/aaai.v33i01.33015757.

Abstract:
The Jacobian matrix (or the gradient for single-output networks) is directly related to many important properties of neural networks, such as the function landscape, stationary points, (local) Lipschitz constants and robustness to adversarial attacks. In this paper, we propose a recursive algorithm, RecurJac, to compute both upper and lower bounds for each element in the Jacobian matrix of a neural network with respect to the network's input, and the network can contain a wide range of activation functions. As a byproduct, we can efficiently obtain a (local) Lipschitz constant, which plays a crucial …
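For context, the baseline that Jacobian-bounding analyses such as RecurJac tighten is the naive product-of-spectral-norms bound. A minimal sketch of that baseline (function name assumed, and this is not the RecurJac algorithm itself):

    import numpy as np

    def naive_lipschitz_bound(weights):
        # Product of layer spectral norms: a valid but often loose global
        # Lipschitz bound for f(x) = W_L s(... s(W_1 x)) with 1-Lipschitz
        # activations such as ReLU.
        bound = 1.0
        for W in weights:
            bound *= np.linalg.norm(W, 2)  # largest singular value
        return bound

    rng = np.random.default_rng(0)
    weights = [rng.normal(size=(16, 8)),
               rng.normal(size=(8, 16)),
               rng.normal(size=(1, 8))]
    print(naive_lipschitz_bound(weights))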
3

Araujo, Alexandre, Benjamin Negrevergne, Yann Chevaleyre, and Jamal Atif. "On Lipschitz Regularization of Convolutional Layers using Toeplitz Matrix Theory." Proceedings of the AAAI Conference on Artificial Intelligence 35, no. 8 (2021): 6661–69. http://dx.doi.org/10.1609/aaai.v35i8.16824.

Abstract:
This paper tackles the problem of Lipschitz regularization of Convolutional Neural Networks. Lipschitz regularity is now established as a key property of modern deep learning, with implications for training stability, generalization, robustness against adversarial examples, etc. However, computing the exact value of the Lipschitz constant of a neural network is known to be NP-hard. Recent attempts in the literature introduce upper bounds to approximate this constant that are either efficient but loose or accurate but computationally expensive. In this work, by leveraging the theory of Toeplitz matrices …
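A hedged sketch of the kind of structure such bounds exploit: for a single-channel 2D convolution with circular padding, the operator is doubly block circulant, so its singular values are the magnitudes of the 2D DFT of the zero-padded kernel, and the spectral norm (the l2 Lipschitz constant) can be read off an FFT. This classical single-channel fact, not the paper's multi-channel bound, is shown; the function name is an assumption.

    import numpy as np

    def circular_conv_lipschitz(kernel, n):
        # Zero-pad the kernel to the n x n input size; the singular values
        # of the circular convolution operator are |FFT2(padded kernel)|,
        # so the spectral norm is their maximum.
        padded = np.zeros((n, n))
        k_h, k_w = kernel.shape
        padded[:k_h, :k_w] = kernel
        return np.abs(np.fft.fft2(padded)).max()

    kernel = np.array([[0.25, 0.5, 0.25],
                       [0.5,  1.0, 0.5 ],
                       [0.25, 0.5, 0.25]])
    print(circular_conv_lipschitz(kernel, n=32))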
4

Xu, Yuhui, Wenrui Dai, Yingyong Qi, Junni Zou, and Hongkai Xiong. "Iterative Deep Neural Network Quantization With Lipschitz Constraint." IEEE Transactions on Multimedia 22, no. 7 (2020): 1874–88. http://dx.doi.org/10.1109/tmm.2019.2949857.

5

Mohammad, Ibtihal J. "Neural Networks of the Rational r-th Powers of the Multivariate Bernstein Operators." Basra Journal of Science 40, no. 2 (2022): 258–73. http://dx.doi.org/10.29072/basjs.20220201.

Abstract:
In this study, a novel neural network for the multivariate Bernstein operators' rational powers was developed. These networks require a positive integer parameter. In the space of all real-valued continuous functions, the pointwise and uniform approximation theorems are introduced and examined first. After that, the Lipschitz space is used to study two key theorems. Additionally, some numerical examples are provided to demonstrate how well these neural networks approximate two test functions. The numerical outcomes demonstrate that as the input grows, the neural network provides a better approximation …
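For orientation, the classical univariate Bernstein operator that such networks generalize is given below; the rational r-th-power and multivariate constructions are the paper's extensions, and this formula is standard rather than quoted from it:

    B_n(f; x) = \sum_{k=0}^{n} \binom{n}{k} x^{k} (1 - x)^{n-k} \, f\!\left(\frac{k}{n}\right), \qquad x \in [0, 1].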
6

Mohammad, Ibtihal J., and Ali J. Mohammad. "Neural Network of Multivariate Square Rational Bernstein Operators with Positive Integer Parameter." European Journal of Pure and Applied Mathematics 15, no. 3 (2022): 1189–200. http://dx.doi.org/10.29020/nybg.ejpam.v15i3.4425.

Abstract:
This research defines a new neural network (NN) that depends upon a positive integer parameter, using the multivariate square rational Bernstein polynomials. Some theorems for this network are proved, such as the pointwise and the uniform approximation theorems. Firstly, the absolute moment for a function that belongs to Lipschitz space is defined to estimate the order of the NN. Secondly, some numerical applications for this NN are given by taking two test functions. Finally, the numerical results for this network are compared with the classical neural networks (NNs). The results turn out …
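The Lipschitz space mentioned in this abstract and the previous one is typically the Hölder-type class; its standard definition, stated here for reference rather than taken from the papers:

    f \in \mathrm{Lip}_{M}(\alpha) \iff |f(x) - f(y)| \le M \, |x - y|^{\alpha} \ \text{for all } x, y, \qquad 0 < \alpha \le 1.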
7

Liu, Kanglin, and Guoping Qiu. "Lipschitz constrained GANs via boundedness and continuity." Neural Computing and Applications 32, no. 24 (2020): 18271–83. http://dx.doi.org/10.1007/s00521-020-04954-z.

Abstract:
One of the challenges in the study of generative adversarial networks (GANs) is the difficulty of controlling their performance. The Lipschitz constraint is essential in guaranteeing training stability for GANs. Although heuristic methods such as weight clipping, gradient penalty and spectral normalization have been proposed to enforce a Lipschitz constraint, it is still difficult to achieve a solution that is both practically effective and theoretically provably satisfying a Lipschitz constraint. In this paper, we introduce the boundedness and continuity (BC) conditions to enforce the Lipschitz constraint …
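Of the heuristics this abstract lists, spectral normalization is the easiest to sketch. A minimal NumPy version using power iteration follows; it illustrates the general technique, not the paper's BC conditions, and the function name is an assumption.

    import numpy as np

    def spectral_normalize(W, n_iter=20, seed=0):
        # Power iteration estimates the largest singular value sigma of W;
        # dividing by it yields a matrix with spectral norm ~1, making the
        # corresponding linear layer 1-Lipschitz.
        u = np.random.default_rng(seed).normal(size=W.shape[0])
        for _ in range(n_iter):
            v = W.T @ u
            v /= np.linalg.norm(v)
            u = W @ v
            u /= np.linalg.norm(u)
        sigma = u @ W @ v
        return W / sigma

    W = np.random.default_rng(1).normal(size=(64, 32))
    print(np.linalg.norm(spectral_normalize(W), 2))  # approximately 1.0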
8

Othmani, S., N. E. Tatar, and A. Khemmoudj. "Asymptotic behavior of a BAM neural network with delays of distributed type." Mathematical Modelling of Natural Phenomena 16 (2021): 29. http://dx.doi.org/10.1051/mmnp/2021023.

Abstract:
In this paper, we examine a Bidirectional Associative Memory neural network model with distributed delays. Using a result due to Cid [J. Math. Anal. Appl. 281 (2003) 264–275], we were able to prove an exponential stability result in the case when the standard Lipschitz continuity condition is violated. Indeed, we deal with activation functions which may not be Lipschitz continuous. Therefore, the standard Halanay inequality is not applicable, and we use a nonlinear version of this inequality. In the end, the obtained differential inequality, which should imply the exponential stability, appears …
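For reference, the standard (linear) Halanay inequality whose nonlinear variant the paper resorts to, stated here from the classical literature rather than from the paper: if a nonnegative function v satisfies

    v'(t) \le -a\, v(t) + b \sup_{t - \tau \le s \le t} v(s), \qquad a > b > 0,

then v(t) \le K e^{-\gamma (t - t_0)} for some constants K, \gamma > 0, which yields exponential stability.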
9

Xia, Youshen. "An Extended Projection Neural Network for Constrained Optimization." Neural Computation 16, no. 4 (2004): 863–83. http://dx.doi.org/10.1162/089976604322860730.

Abstract:
Recently, a projection neural network has been shown to be a promising computational model for solving variational inequality problems with box constraints. This letter presents an extended projection neural network for solving monotone variational inequality problems with linear and nonlinear constraints. In particular, the proposed neural network includes the projection neural network as a special case. Compared with the modified projection-type methods for solving constrained monotone variational inequality problems, the proposed neural network has a lower complexity and is suitable for …
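The box-constrained dynamics underlying such projection networks can be sketched in a few lines. Below is a minimal Euler discretization; the example problem, step sizes, and names are assumptions for illustration, not the paper's extended network.

    import numpy as np

    # Euler discretization of the projection dynamics
    #   dx/dt = P_Omega(x - alpha * F(x)) - x
    # for a monotone variational inequality over the box Omega = [0, 1]^2:
    # find x in Omega with (y - x)^T F(x) >= 0 for all y in Omega.
    M = np.array([[3.0, 1.0], [1.0, 2.0]])   # positive definite => F monotone
    q = np.array([-2.0, -6.0])
    F = lambda x: M @ x + q
    lo, hi = np.zeros(2), np.ones(2)

    x = np.zeros(2)
    alpha, dt = 0.2, 0.5
    for _ in range(2000):
        x = x + dt * (np.clip(x - alpha * F(x), lo, hi) - x)
    print(x)  # approaches the VI solution (1/3, 1) on the box boundary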
10

Li, Peiluan, Yuejing Lu, Changjin Xu, and Jing Ren. "Bifurcation Phenomenon and Control Technique in Fractional BAM Neural Network Models Concerning Delays." Fractal and Fractional 7, no. 1 (2022): 7. http://dx.doi.org/10.3390/fractalfract7010007.

Abstract:
In this study, we formulate a new kind of fractional BAM neural network model concerning five neurons and time delays. First, we explore the existence and uniqueness of the solution of the formulated fractional delay BAM neural network models via the Lipschitz condition. Second, we study the boundedness of the solution to the formulated fractional delayed BAM neural network models using a proper function. Third, we set up a novel sufficient criterion on the onset of the Hopf bifurcation and the stability of the formulated fractional BAM neural network models by virtue of the stability criterion …
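The Lipschitz condition invoked here for existence and uniqueness is, in its standard form (stated for reference, with activation f and constant L, not quoted from the paper):

    \| f(u) - f(v) \| \le L \, \| u - v \| \qquad \text{for all } u, v.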