Academic literature on the topic 'Constraint networks'

Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles

Consult the lists of relevant articles, books, theses, conference reports, and other scholarly sources on the topic 'Constraint networks.'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate a bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever these are available in the metadata.

Journal articles on the topic "Constraint networks"

1

Brosowsky, Mathis, Florian Keck, Olaf Dünkel, and Marius Zöllner. "Sample-Specific Output Constraints for Neural Networks." Proceedings of the AAAI Conference on Artificial Intelligence 35, no. 8 (2021): 6812–21. http://dx.doi.org/10.1609/aaai.v35i8.16841.

Abstract:
It is common practice to constrain the output space of a neural network with the final layer to a problem-specific value range. However, for many tasks it is desired to restrict the output space for each input independently to a different subdomain with a non-trivial geometry, e.g. in safety-critical applications, to exclude hazardous outputs sample-wise. We propose ConstraintNet—a scalable neural network architecture which constrains the output space in each forward pass independently. Contrary to prior approaches, which perform a projection in the final layer, ConstraintNet applies an input-dependent parametrization of the constrained output space. Thereby, the complete interior of the constrained region is covered and computational costs are reduced significantly. For constraints in form of convex polytopes, we leverage the vertex representation to specify the parametrization. The second modification consists of adding an auxiliary input in form of a tensor description of the constraint to enable the handling of multiple constraints for the same sample. Finally, ConstraintNet is end-to-end trainable with almost no overhead in the forward and backward pass. We demonstrate ConstraintNet on two regression tasks: First, we modify a CNN and construct several constraints for facial landmark detection tasks. Second, we demonstrate the application to a follow object controller for vehicles and accomplish safe reinforcement learning in this case. In both experiments, ConstraintNet improves performance and we conclude that our approach is promising for applying neural networks in safety-critical environments.
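The vertex-based parametrization described in the abstract can be illustrated with a minimal NumPy sketch (our own simplification under assumed shapes, not the authors' code): the final layer emits one logit per polytope vertex, and a softmax turns those logits into a convex combination of the vertices, so the output is guaranteed to lie inside the sample-specific polytope.

```python
import numpy as np

def softmax(z):
    """Numerically stable softmax."""
    e = np.exp(z - z.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def constrained_output(logits, vertices):
    """Map raw final-layer logits to a point of the convex polytope
    given in vertex representation: the softmax weights form a convex
    combination of the vertices, so the output always lies in the
    polytope and can reach any interior point."""
    weights = softmax(logits)   # non-negative, sums to 1
    return weights @ vertices   # (k,) @ (k, d) -> (d,)

# Example: constrain a 2-D output to the triangle (0,0), (1,0), (0,1)
triangle = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0]])
y = constrained_output(np.array([2.0, -1.0, 0.5]), triangle)
```

Because the constraint enters only through the `vertices` tensor, a different polytope can be supplied for every sample in a batch, which matches the abstract's description of an input-dependent, end-to-end trainable parametrization.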
2

Rong, Zihao, Shaofan Wang, Dehui Kong, and Baocai Yin. "Improving object detection quality with structural constraints." PLOS ONE 17, no. 5 (2022): e0267863. http://dx.doi.org/10.1371/journal.pone.0267863.

Abstract:
Recent research has revealed that object detection networks trained with the simple “classification loss + localization loss” objective are not effectively optimized in many cases, while providing additional constraints on network features can effectively improve object detection quality. Specifically, some works used constraints on training-sample relations to successfully learn discriminative network features. Based on these observations, we propose a Structural Constraint for improving object detection quality. The structural constraint supervises feature learning in the classification and localization network branches with Fisher Loss and Equi-proportion Loss respectively, by requiring the feature similarities of training-sample pairs to be consistent with the corresponding ground-truth label similarities. The structural constraint can be applied to all object detection network architectures with the assistance of our Proxy Feature design. Our experimental results showed that the structural constraint mechanism optimizes the distribution of object class instances in network feature space, and consequently the detection results. Evaluations on the MSCOCO2017 and KITTI datasets showed that our structural constraint mechanism helps baseline networks outperform modern counterpart detectors in terms of object detection quality.
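A toy sketch of the pairwise-similarity idea in the abstract (our own reading; the function name and the mean-squared form are assumptions, not the authors' Fisher or Equi-proportion losses): feature similarities of sample pairs are pushed toward their ground-truth label similarities.

```python
import numpy as np

def structural_constraint_loss(features, labels):
    """Push the cosine similarity of every feature pair toward 1 when
    the pair shares a ground-truth label and toward 0 otherwise
    (a mean-squared toy stand-in for the paper's losses)."""
    f = features / np.linalg.norm(features, axis=1, keepdims=True)
    sim = f @ f.T                                    # pairwise cosine similarity
    target = (labels[:, None] == labels[None, :]).astype(float)
    return np.mean((sim - target) ** 2)

# Clustered same-class features yield a lower loss than mismatched ones:
feats = np.array([[1.0, 0.0], [1.0, 0.1], [0.0, 1.0]])
well_grouped = structural_constraint_loss(feats, np.array([0, 0, 1]))
mismatched = structural_constraint_loss(feats, np.array([0, 1, 0]))
```

Added to the usual classification and localization terms, such a penalty rewards class-consistent structure in the feature space, which is the effect the abstract reports.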
3

Zhang, Y., and R. H. C. Yap. "Set Intersection and Consistency in Constraint Networks." Journal of Artificial Intelligence Research 27 (December 13, 2006): 441–64. http://dx.doi.org/10.1613/jair.2058.

Abstract:
In this paper, we show that there is a close relation between consistency in a constraint network and set intersection. A proof schema is provided as a generic way to obtain consistency properties from properties on set intersection. This approach not only simplifies the understanding of and unifies many existing consistency results, but also directs the study of consistency to that of set intersection properties in many situations, as demonstrated by the results on the convexity and tightness of constraints in this paper. Specifically, we identify a new class of tree convex constraints where local consistency ensures global consistency. This generalizes row convex constraints. Various consistency results are also obtained on constraint networks where only some, in contrast to all in the existing work, constraints are tight.
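The local consistency studied here can be illustrated with a generic arc-consistency (AC-3) sketch for binary constraint networks (a standard textbook algorithm, not the paper's proof technique): values without support in a neighbouring domain are pruned until a fixpoint is reached.

```python
def revise(domains, x, y, allowed):
    """Remove values of x that have no supporting value of y under the
    binary constraint `allowed`, a set of permitted (x, y) value pairs."""
    removed = False
    for vx in list(domains[x]):
        if not any((vx, vy) in allowed for vy in domains[y]):
            domains[x].discard(vx)
            removed = True
    return removed

def ac3(domains, constraints):
    """Enforce arc consistency; `constraints` maps ordered variable
    pairs (x, y) to their sets of allowed (value_x, value_y) pairs.
    Returns False if some domain is wiped out (network inconsistent)."""
    queue = list(constraints)
    while queue:
        x, y = queue.pop()
        if revise(domains, x, y, constraints[(x, y)]):
            if not domains[x]:
                return False
            # re-examine arcs that point at x
            queue.extend((z, w) for (z, w) in constraints if w == x and z != y)
    return True

# Example: x < y with both domains {1, 2, 3}
lt = {(a, b) for a in (1, 2, 3) for b in (1, 2, 3) if a < b}
domains = {"x": {1, 2, 3}, "y": {1, 2, 3}}
constraints = {("x", "y"): lt, ("y", "x"): {(b, a) for (a, b) in lt}}
ac3(domains, constraints)   # domains become x -> {1, 2}, y -> {2, 3}
```

For general constraints, arc consistency is only local; the paper's point is that for special classes such as tree convex constraints, local consistency of this kind already guarantees global consistency.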
4

Kharroubi, Idris, Thomas Lim, and Xavier Warin. "Discretization and machine learning approximation of BSDEs with a constraint on the Gains-process." Monte Carlo Methods and Applications 27, no. 1 (2021): 27–55. http://dx.doi.org/10.1515/mcma-2020-2080.

Abstract:
We study the approximation of backward stochastic differential equations (BSDEs for short) with a constraint on the gains process. We first discretize the constraint by applying a so-called facelift operator at the times of a grid. We show that this discretely constrained BSDE converges to the continuously constrained one as the mesh of the grid converges to zero. We then focus on the approximation of the discretely constrained BSDE. For that we adopt a machine learning approach. We show that the facelift can be approximated by an optimization problem over a class of neural networks under constraints on the neural network and its derivative. We then derive an algorithm converging to the discretely constrained BSDE as the number of neurons goes to infinity. We end with numerical experiments.
5

Dechter, Rina, Itay Meiri, and Judea Pearl. "Temporal constraint networks." Artificial Intelligence 49, no. 1-3 (1991): 61–95. http://dx.doi.org/10.1016/0004-3702(91)90006-6.

6

Msaaf, Mohammed, and Fouad Belmajdoub. "Diagnosis of Discrete Event Systems under Temporal Constraints Using Neural Network." International Journal of Engineering Research in Africa 49 (June 2020): 198–205. http://dx.doi.org/10.4028/www.scientific.net/jera.49.198.

Abstract:
The correct functioning of a discrete event system (DES) depends on how well its temporal constraints are respected. This paper presents a new approach, based on a statistical model and a neural network, that allows the verification of temporal constraints in a DES. We perform online temporal constraint checking that can detect, in real time, any abnormal functioning related to the violation of a temporal constraint. In the first phase, temporal constraints are constructed from the statistical model; neural networks then handle the online temporal constraint checking.
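The statistical first phase described in the abstract can be sketched as follows (a hypothetical simplification; the mean ± 3σ bound and the class name are our assumptions, and the neural-network phase is omitted): duration bounds for each transition are learned from training traces and then checked online.

```python
import statistics

class TemporalMonitor:
    """Learn per-transition duration bounds from training traces and
    flag online durations outside mean +/- 3 standard deviations."""

    def __init__(self, traces):
        # traces: dict mapping transition name -> list of observed durations
        self.bounds = {}
        for name, durations in traces.items():
            mu = statistics.mean(durations)
            sigma = statistics.stdev(durations)
            self.bounds[name] = (mu - 3 * sigma, mu + 3 * sigma)

    def check(self, transition, duration):
        """Return True if the observed duration respects the learned
        temporal constraint for this transition."""
        lo, hi = self.bounds[transition]
        return lo <= duration <= hi

monitor = TemporalMonitor({"valve_open->valve_close": [1.0, 1.1, 0.9, 1.0]})
```

A duration near the learned mean passes the check, while a grossly delayed transition is flagged as a violation in real time.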
7

Wang, Xiao Fei, Xi Zhang, Yue Bing Chen, Lei Zhang, and Chao Jing Tang. "Spectrum Assignment Algorithm Based on Clonal Selection in Cognitive Radio Networks." Advanced Materials Research 457-458 (January 2012): 931–39. http://dx.doi.org/10.4028/www.scientific.net/amr.457-458.931.

Abstract:
An improved-immune-clonal-selection based spectrum assignment algorithm (IICSA) for cognitive radio networks is proposed, combining graph theory and immune optimization. It uses a constraint satisfaction operation to make the encoded antibody population satisfy the constraints, and achieves global optimization. A random-constraint satisfaction operator and a fair-constraint satisfaction operator are designed to guarantee efficiency and fairness, respectively. Simulations compare the performance of IICSA with the color-sensitive graph coloring algorithm. The results indicate that the proposed algorithm increases network utilization and effectively improves fairness.
8

Buscema, Massimo. "Constraint Satisfaction Neural Networks." Substance Use & Misuse 33, no. 2 (1998): 389–408. http://dx.doi.org/10.3109/10826089809115873.

9

Suter, D. "Constraint networks in vision." IEEE Transactions on Computers 40, no. 12 (1991): 1359–67. http://dx.doi.org/10.1109/12.106221.

10

Gottlob, Georg. "On minimal constraint networks." Artificial Intelligence 191-192 (November 2012): 42–60. http://dx.doi.org/10.1016/j.artint.2012.07.006.
