
Journal articles on the topic 'Network tuning'


Consult the top 50 journal articles for your research on the topic 'Network tuning.'


1

Wani, M. Arif, and Saduf Afzal. "Optimization of deep network models through fine tuning." International Journal of Intelligent Computing and Cybernetics 11, no. 3 (2018): 386–403. http://dx.doi.org/10.1108/ijicc-06-2017-0070.

Abstract:
Purpose: Many strategies have been put forward for training deep network models; however, stacking several layers of non-linearities typically results in poor propagation of gradients and activations. The purpose of this paper is to explore a two-step strategy in which an initial deep learning model is first obtained by unsupervised learning and then optimized by fine tuning. A number of fine-tuning algorithms are explored in this work for optimizing deep learning models, including a proposed new algorithm in which Backpropagation with adaptive gain alg
2

Konarev, D., and A. Gulamov. "ACCURACY IMPROVING OF PRE-TRAINED NEURAL NETWORKS BY FINE TUNING." EurasianUnionScientists 5, no. 1(82) (2021): 26–28. http://dx.doi.org/10.31618/esu.2413-9335.2021.5.82.1231.

Abstract:
Methods of improving the accuracy of pre-trained networks are discussed. Images of ships are the input data for the networks. The networks are built and trained using the Keras and TensorFlow machine learning libraries. Fine tuning of previously trained convolutional artificial neural networks for pattern-recognition tasks is described. Fine tuning of the VGG16 and VGG19 networks is done by using Keras Applications. The accuracy of the VGG16 network with fine tuning of the last convolution unit increased from 94.38% to 95.21%, an increase of only 0.83%. The accuracy of VGG19 network with fine-tuning of the last convolut
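The freeze-and-fine-tune recipe this abstract describes can be sketched without Keras. The following pure-NumPy toy (the layer shapes, data, and learning rate are invented for illustration, not taken from the paper) freezes a "pre-trained" layer and trains only the head:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for a pre-trained network: a frozen feature layer plus a head.
W1 = rng.normal(size=(4, 8))            # "pre-trained" layer, kept frozen
W2 = rng.normal(size=(8, 1)) * 0.1      # head that we fine-tune

X = rng.normal(size=(32, 4))
y = (X.sum(axis=1, keepdims=True) > 0).astype(float)

def loss_and_probs(W2):
    h = np.maximum(X @ W1, 0.0)         # frozen ReLU features
    p = 1.0 / (1.0 + np.exp(-h @ W2))
    p = np.clip(p, 1e-9, 1 - 1e-9)
    bce = -np.mean(y * np.log(p) + (1 - y) * np.log(1 - p))
    return bce, h, p

W1_before = W1.copy()
loss_before, _, _ = loss_and_probs(W2)

for _ in range(200):                    # gradient steps on the head only
    _, h, p = loss_and_probs(W2)
    W2 -= 0.5 * h.T @ (p - y) / len(X)  # W1 is never updated

loss_after, _, _ = loss_and_probs(W2)
```

Fine-tuning deeper blocks, as done for VGG16's last convolution unit, amounts to moving more parameters from the frozen group into the trained group.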
3

Machens, Christian K., and Carlos D. Brody. "Design of Continuous Attractor Networks with Monotonic Tuning Using a Symmetry Principle." Neural Computation 20, no. 2 (2008): 452–85. http://dx.doi.org/10.1162/neco.2007.07-06-297.

Abstract:
Neurons that sustain elevated firing in the absence of stimuli have been found in many neural systems. In graded persistent activity, neurons can sustain firing at many levels, suggesting a widely found type of network dynamics in which networks can relax to any one of a continuum of stationary states. The reproduction of these findings in model networks of nonlinear neurons has turned out to be nontrivial. A particularly insightful model has been the “bump attractor,” in which a continuous attractor emerges through an underlying symmetry in the network connectivity matrix. This model, however
4

Ge, Changhan, Ajay Mahimkar, Zihui Ge, et al. "Iridescence: Improving Configuration Tuning in the Presence of Confounders for 5G NSA Networks." Proceedings of the ACM on Networking 3, CoNEXT1 (2025): 1–22. https://doi.org/10.1145/3709378.

Abstract:
Configuration tuning is one of the top network operational tasks for Cellular Service Providers (CSPs), and is typically done either to restore performance during degraded network conditions such as congestion, failures, and planned upgrades, or to optimize service performance through change trials. A long-standing challenge in tuning has been to establish a causal relationship between a configuration change and a service performance impact. Confounders (external factors) make this extremely hard. In this paper, we focus on improving configuration tuning in the presence of confounders for 5G Non-s
5

Menapace, Andrea, Ariele Zanfei, and Maurizio Righetti. "Tuning ANN Hyperparameters for Forecasting Drinking Water Demand." Applied Sciences 11, no. 9 (2021): 4290. http://dx.doi.org/10.3390/app11094290.

Abstract:
The evolution of smart water grids leads to new Big Data challenges boosting the development and application of Machine Learning techniques to support efficient and sustainable drinking water management. These powerful techniques rely on hyperparameters, making the models' tuning a tricky and crucial task. We hence propose an insightful analysis of the tuning of Artificial Neural Networks for drinking water demand forecasting. This study focuses on fitting the layer and node hyperparameters of different Neural Network architectures through a grid search method, varying the dataset, prediction hori
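The layer/node grid search this abstract refers to reduces to iterating over every hyperparameter combination and keeping the one with the lowest validation error. A minimal sketch, where the error function is a made-up stand-in for training an ANN and scoring its demand forecast:

```python
import itertools

# Hypothetical validation error; in the study this would come from training an
# ANN with the given depth/width and scoring its water-demand forecast.
def validation_error(n_layers, n_nodes):
    return (n_layers - 2) ** 2 + (n_nodes - 32) ** 2 / 100.0

# Exhaustive grid over (number of layers, nodes per layer).
grid = list(itertools.product([1, 2, 3, 4], [8, 16, 32, 64]))
best_config = min(grid, key=lambda cfg: validation_error(*cfg))
```

With a real model, each grid point is a full training run, which is why such grids are usually kept coarse.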
6

Freitag, J., N. L. S. da Fonseca, and J. F. de Rezende. "Tuning of 802.11e network parameters." IEEE Communications Letters 10, no. 8 (2006): 611–13. http://dx.doi.org/10.1109/lcomm.2006.1665127.

7

Liu, Junhong, and Jouni Lampinen. "Approximation by Growing Radial Basis Function Networks Using the Differential-Evolution-Based Algorithm." Journal of Advanced Computational Intelligence and Intelligent Informatics 9, no. 5 (2005): 540–48. http://dx.doi.org/10.20965/jaciii.2005.p0540.

Abstract:
The differential evolution (DE) algorithm is a floating-point-encoded evolutionary algorithm for global optimization. We applied a DE-based method to training radial basis function (RBF) networks with variables including centers, weights, and widths. This algorithm consists of three steps – initial tuning focusing on finding the center of a one-node RBF network, local tuning, and global tuning both using cycling schemes to find RBF network parameters. The mean square error from desired output to actual network output is applied as the objective function to be minimized. Network training is sho
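The DE-based parameter fit described above can be reproduced in miniature with SciPy's `differential_evolution`. This sketch fits a one-node RBF (center, width, weight) to synthetic data; the paper's three-stage initial/local/global tuning scheme is not reproduced, and the data are invented:

```python
import numpy as np
from scipy.optimize import differential_evolution

# Synthetic target generated by a single RBF node: w * exp(-(x - c)^2 / (2 s^2)).
x = np.linspace(-1.0, 5.0, 80)
c_true, s_true, w_true = 2.0, 0.5, 1.5
y = w_true * np.exp(-((x - c_true) ** 2) / (2 * s_true ** 2))

def mse(params):
    # Objective: mean square error between RBF output and desired output.
    c, s, w = params
    pred = w * np.exp(-((x - c) ** 2) / (2 * s ** 2))
    return float(np.mean((pred - y) ** 2))

# Floating-point-encoded evolutionary search over (center, width, weight).
result = differential_evolution(mse, bounds=[(0, 4), (0.1, 2), (0, 3)], seed=0)
```

`result.x` recovers the generating parameters because the objective is smooth and the bounds bracket them.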
8

Li, Tianxi, Elizaveta Levina, and Ji Zhu. "Network cross-validation by edge sampling." Biometrika 107, no. 2 (2020): 257–76. http://dx.doi.org/10.1093/biomet/asaa006.

Abstract:
Summary While many statistical models and methods are now available for network analysis, resampling of network data remains a challenging problem. Cross-validation is a useful general tool for model selection and parameter tuning, but it is not directly applicable to networks since splitting network nodes into groups requires deleting edges and destroys some of the network structure. In this paper we propose a new network resampling strategy, based on splitting node pairs rather than nodes, that is applicable to cross-validation for a wide range of network model selection tasks. We provide th
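The node-pair-splitting idea can be illustrated directly: instead of holding out nodes (which deletes edges and destroys structure), hold out entries of the adjacency matrix. A minimal sketch of the fold assignment only; the model refitting and imputation steps of the paper are omitted:

```python
import numpy as np

def edge_cv_folds(n_nodes, n_folds, seed=0):
    """Randomly partition all node pairs (i < j) into cross-validation folds."""
    rng = np.random.default_rng(seed)
    pairs = [(i, j) for i in range(n_nodes) for j in range(i + 1, n_nodes)]
    order = rng.permutation(len(pairs))
    folds = [[] for _ in range(n_folds)]
    for k, idx in enumerate(order):
        folds[k % n_folds].append(pairs[idx])
    return folds

folds = edge_cv_folds(n_nodes=10, n_folds=3)
# Held-out pairs would be masked in the adjacency matrix and predicted by a
# model fitted on the remaining pairs; no node (no full row) is ever deleted.
```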
9

Polyakova, M. V. "RCF-ST: RICHER CONVOLUTIONAL FEATURES NETWORK WITH STRUCTURAL TUNING FOR THE EDGE DETECTION ON NATURAL IMAGES." Radio Electronics, Computer Science, Control, no. 4 (January 4, 2024): 122. http://dx.doi.org/10.15588/1607-3274-2023-4-12.

Abstract:
Context. The problem of automating edge detection on natural images in intelligent systems is considered. The subject of the research is deep learning convolutional neural networks for edge detection on natural images.
Objective. The objective of the research is to improve the edge detection performance on natural images by structurally tuning the richer convolutional features network architecture.
Method. In general, edge detection performance is influenced by the neural network architecture. To automate the design of the network structure, in the paper a structural tuni
10

Ha, Seokhyeon, Sunbeom Jeong, and Jungwoo Lee. "Domain-Aware Fine-Tuning: Enhancing Neural Network Adaptability." Proceedings of the AAAI Conference on Artificial Intelligence 38, no. 11 (2024): 12261–69. http://dx.doi.org/10.1609/aaai.v38i11.29116.

Abstract:
Fine-tuning pre-trained neural network models has become a widely adopted approach across various domains. However, it can lead to the distortion of pre-trained feature extractors that already possess strong generalization capabilities. Mitigating feature distortion during adaptation to new target domains is crucial. Recent studies have shown promising results in handling feature distortion by aligning the head layer on in-distribution datasets before performing fine-tuning. Nonetheless, a significant limitation arises from the treatment of batch normalization layers during fine-tuning, leadin
11

Zhuo, Pin. "Optimization of Intelligent Tuning System for Stringed Instruments Based on Wireless Sensor Network." Journal of Sensors 2021 (November 9, 2021): 1–10. http://dx.doi.org/10.1155/2021/6538185.

Abstract:
With the increasing abundance of material civilization, people's pursuit of spiritual civilization is also rising. Although tuning is the analysis and processing of sound signals, a string signal is also a kind of sound signal and follows the basic laws of acoustics. In the field of wireless sensor networks, this paper introduces the status quo of tuning tools for stringed instruments, analyses the existing problems of tuning tools for mainstream stringed instruments, and puts forward an automatic tuning system based on a wireless sensor network. The hardware design of the intelligent t
12

Herminarto, Nugroho, Kunto Wibowo Wahyu, Rahma Annisa Aulia, and Megawati Rosalinda Hanny. "Deep Learning for Tuning Optical Beamforming Networks." TELKOMNIKA Telecommunication, Computing, Electronics and Control 16, no. 4 (2018): 1607–15. https://doi.org/10.12928/TELKOMNIKA.v16i4.8176.

Abstract:
In communication between planes and satellites, Optical Beamforming Networks (OBFNs), which rely on many small and flat Phased Array Antennas (PAAs), need to be tuned in order to receive signals from specific angles. In this paper, we develop a deep neural network representation of tuning OBFNs. The problem of tuning an OBFN is in many aspects similar to training a deep neural network. We present a way to exploit the special structure of OBFNs into deep neural network and an algorithm for tuning OBFNs based on feedback that can be easily measured in real system. Training data, which consists o
13

Willemen, Pieter, Daniela Laselva, Yu Wang, Istvan Kovács, Relja Djapic, and Ingrid Moerman. "SON for LTE-WLAN access network selection: design and performance." EURASIP Journal on Wireless Communications and Networking 2016, no. 1 (2016): 230. https://doi.org/10.1186/s13638-016-0726-x.

Abstract:
Mobile network operators (MNOs) are deploying carrier-grade Wireless Local Area Network (WLAN) as an important complementary system to cellular networks. Access network selection (ANS) between cellular and WLAN is an essential component to improve network performance and user quality-of-service (QoS) via controlled loading of these systems. In emerging heterogeneous networks characterized by different cell sizes and diverse WLAN deployments, automatic tuning of the network selection functionality plays a crucial role. In this article, we present two distinct Self-Organizing Network (SON) schem
14

Nasir, Muhammad Umar, Taher M. Ghazal, Muhammad Adnan Khan, et al. "Breast Cancer Prediction Empowered with Fine-Tuning." Computational Intelligence and Neuroscience 2022 (June 9, 2022): 1–9. http://dx.doi.org/10.1155/2022/5918686.

Abstract:
Worldwide, in the past five years, about 7.8 million women were diagnosed with breast cancer, making it the most widespread cancer and the second major cause of women's deaths. So, early prevention and diagnosis systems for breast cancer could be very helpful and significant. Neural networks can extract multiple features automatically and perform predictions on breast cancer. Training neural networks requires many labeled images, which is impractical for some types of image data, such as breast magnetic resonance imaging (MRI) images. So, there is only
15

PURNAMA, AFATAH, EKA SETIA NUGRAHA, and MUNTAQO ALFIN AMANAF. "Penerapan Metode ACP untuk Optimasi Physical Tuning Antena Sektoral pada Jaringan 4G LTE di Kota Purwokerto." ELKOMIKA: Jurnal Teknik Energi Elektrik, Teknik Telekomunikasi, & Teknik Elektronika 8, no. 1 (2020): 138. http://dx.doi.org/10.26760/elkomika.v8i1.138.

Abstract:
ABSTRACT

The quality of the 4G (LTE) network is still unstable, causing bad coverage. The quality of the 4G (LTE) network can be improved by optimizing the physical tuning of sectoral antennas. Physical tuning of a sectoral antenna includes changing the antenna height, azimuth, and tilt. In this study, the physical tuning of sectoral antennas was optimized using the Automatic Cell Planning (ACP) method to meet coverage needs in the West Purwokerto and North Purwokerto areas. The coverage percentage obtained by the existing sites does not yet meet the operator's KPI standard for
16

Ge, Yisu, Shufang Lu, and Fei Gao. "Small Network for Lightweight Task in Computer Vision: A Pruning Method Based on Feature Representation." Computational Intelligence and Neuroscience 2021 (April 17, 2021): 1–12. http://dx.doi.org/10.1155/2021/5531023.

Abstract:
Many current convolutional neural networks are hard to apply in practice because of their enormous numbers of parameters. To accelerate the inference speed of networks, more and more attention has been paid to network compression. Network pruning is one of the most efficient and simplest ways to compress and speed up networks. In this paper, a pruning algorithm for lightweight tasks is proposed, and a pruning strategy based on feature representation is investigated. Different from other pruning approaches, the proposed strategy is guided by the practical task and
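As a toy illustration of feature-guided channel pruning: the paper's strategy is task-driven, whereas this sketch (an assumption of ours) scores channels only by mean activation magnitude and drops the weakest ones.

```python
import numpy as np

rng = np.random.default_rng(1)

# Fake feature maps from one conv layer: (batch, channels, height, width).
feats = rng.normal(size=(16, 8, 6, 6))
feats[:, [2, 5]] *= 0.01               # two nearly inactive channels

# Score each channel by the mean L1 norm of its responses ...
scores = np.abs(feats).mean(axis=(0, 2, 3))
# ... and drop the weakest channels (prune 2 of 8 here).
keep = np.sort(np.argsort(scores)[2:])
pruned = feats[:, keep]
```

In a real network, the corresponding filters of this layer and the matching input channels of the next layer would be removed, then the model fine-tuned to recover accuracy.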
17

Lee, Yong-Seok, and Dong-Won Jang. "Optimization of Neural Network-Based Self-Tuning PID Controllers for Second Order Mechanical Systems." Applied Sciences 11, no. 17 (2021): 8002. http://dx.doi.org/10.3390/app11178002.

Abstract:
The feasibility of a neural network method was discussed in terms of a self-tuning proportional–integral–derivative (PID) controller. The proposed method was configured with two neural networks to dramatically reduce the number of tuning attempts with a practically achievable small amount of data acquisition. The first network identified the target system from response data, previous PID parameters, and response characteristics. The second network recommended PID parameters based on the results of the first network. The results showed that it could recommend PID parameters within 2 s of observ
18

Khandish, Batbayar, and Jung-Bong Suk. "A Network Traffic Aware Tuning Algorithm in Wireless Sensor Networks." Wireless Personal Communications 118, no. 1 (2021): 431–45. http://dx.doi.org/10.1007/s11277-020-08021-7.

19

LIN, BEY-CHI, and F. K. HWANG. "GENERALIZING AND FINE TUNING TRIPLE-LOOP NETWORKS." Journal of Interconnection Networks 10, no. 01n02 (2009): 133–48. http://dx.doi.org/10.1142/s0219265909002479.

Abstract:
Several triple-loop networks have been recently proposed and their efficiency studied. However, the number of cases for which one of these networks exists is sparse. We extend these networks to larger classes to enhance their realizability. We also give a heuristic method to optimize the network parameters to increase their efficiency. Finally, we determine their wide diameters to understand their fault-tolerance performance.
20

Anand, Manish, Edmund B. Nightingale, and Jason Flinn. "Self-Tuning Wireless Network Power Management." Wireless Networks 11, no. 4 (2005): 451–69. http://dx.doi.org/10.1007/s11276-005-1768-x.

21

Pearlmutter, Barak A., and Conor J. Houghton. "A New Hypothesis for Sleep: Tuning for Criticality." Neural Computation 21, no. 6 (2009): 1622–41. http://dx.doi.org/10.1162/neco.2009.05-08-787.

Abstract:
We propose that the critical function of sleep is to prevent uncontrolled neuronal feedback while allowing rapid responses and prolonged retention of short-term memories. Through learning, the brain is tuned to react optimally to environmental challenges. Optimal behavior often requires rapid responses and the prolonged retention of short-term memories. At a neuronal level, these correspond to recurrent activity in local networks. Unfortunately, when a network exhibits recurrent activity, small changes in the parameters or conditions can lead to runaway oscillations. Thus, the very changes tha
22

Khan, Imran, Archit Joshi, FNU Antara, Satendra Pal Singh, Om Goel, and Shalu Jain. "Performance Tuning of 5G Networks Using AI and Machine Learning Algorithms." International Journal for Research Publication and Seminar 11, no. 4 (2020): 406–23. http://dx.doi.org/10.36676/jrps.v11.i4.1589.

Abstract:
As the demand for faster and more reliable mobile networks intensifies, the deployment of 5G has emerged as a transformative solution to meet the growing needs of connectivity. However, to fully leverage the potential of 5G networks, it is crucial to optimize their performance. This paper explores the application of Artificial Intelligence (AI) and Machine Learning (ML) algorithms in the performance tuning of 5G networks, focusing on areas such as network resource allocation, traffic prediction, and real-time decision-making. By analyzing vast datasets generated by 5G infrastructures, AI and M
23

Matychenko, Anastasiia D., and Marina V. Polyakova. "The structural tuning of the convolutional neural network for speaker identification in mel frequency cepstrum coefficients space." Herald of Advanced Information Technology 6, no. 2 (2023): 115–27. http://dx.doi.org/10.15276/hait.06.2023.7.

Abstract:
As a result of the literature analysis, the main methods for speaker identification from speech signals were defined. These are statistical methods based on a Gaussian mixture model and a universal background model, as well as neural network methods, in particular, using convolutional or Siamese neural networks. The main characteristics of these methods are recognition performance, the number of parameters, and the training time. High recognition performance is achieved by using convolutional neural networks, but the number of parameters of these networks is much higher than for statistical met
24

Kadek Eka Sapta Wijaya, Gede Angga Pradipta, and Dadang Hermawan. "The Implementation of Bayesian Optimization for Automatic Parameter Selection in Convolutional Neural Network for Lung Nodule Classification." Jurnal Nasional Pendidikan Teknik Informatika (JANAPATI) 13, no. 3 (2024): 438–49. https://doi.org/10.23887/janapati.v13i3.82467.

Abstract:
Lung cancer's high mortality rate makes early detection crucial. Machine learning techniques, especially convolutional neural networks (CNN), play a very important role in lung nodule detection. Traditional machine learning approaches often require manual feature extraction, while CNNs, as a specialized type of neural network, automatically learn features directly from the data. However, tuning CNN hyperparameters, such as network structure and training parameters, is computationally intensive. Bayesian Optimization offers a solution by efficiently selecting parameter values. This study develo
25

Alhussan, Amel, and Khalil El Hindi. "Selectively Fine-Tuning Bayesian Network Learning Algorithm." International Journal of Pattern Recognition and Artificial Intelligence 30, no. 08 (2016): 1651005. http://dx.doi.org/10.1142/s0218001416510058.

Abstract:
In this work, we propose a Selective Fine-Tuning algorithm for Bayesian Networks (SFTBN). The aim is to enhance the accuracy of Bayesian Network (BN) classifiers by finding better estimates for the probability terms used by the classifiers. The algorithm augments a BN learning algorithm with a fine-tuning stage that aims to estimate more accurately the probability terms used by the BN. If the value of a probability term causes a misclassification of a training instance and falls outside its valid range, then we update (fine-tune) that value. The amount of such an update is proportional to th
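The update rule sketched in this abstract (adjust a probability term that caused a misclassification and lies outside its valid range, by an amount proportional to the error) might look like the following; the numbers, valid range, and learning constant are all invented for illustration and are not from the paper:

```python
def fine_tune_term(p, valid_range, error, eta=0.4):
    """Nudge probability term p toward its valid range, proportionally to error."""
    lo, hi = valid_range
    if lo <= p <= hi:
        return p                      # already valid: leave it unchanged
    target = min(max(p, lo), hi)      # nearest point of the valid range
    return p + eta * error * (target - p)

# A term of 0.20 that fell below its hypothetical valid range [0.25, 0.60]:
p_new = fine_tune_term(p=0.20, valid_range=(0.25, 0.60), error=0.5)
```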
26

Lee, Gwo-Chuan, Jyun-Hong Li, and Zi-Yang Li. "A Wasserstein Generative Adversarial Network–Gradient Penalty-Based Model with Imbalanced Data Enhancement for Network Intrusion Detection." Applied Sciences 13, no. 14 (2023): 8132. http://dx.doi.org/10.3390/app13148132.

Abstract:
In today’s network intrusion detection systems (NIDS), certain types of network attack packets are sparse compared to regular network packets, making them challenging to collect, and resulting in significant data imbalances in public NIDS datasets. With respect to attack types with rare data, it is difficult to classify them, even by using various algorithms such as machine learning and deep learning. To address this issue, this study proposes a data augmentation technique based on the WGAN-GP model to enhance the recognition accuracy of sparse attacks in network intrusion detection. The enhan
27

Giannakakis, Emmanouil, Oleg Vinogradov, Victor Buendía, and Anna Levina. "Structural influences on synaptic plasticity: The role of presynaptic connectivity in the emergence of E/I co-tuning." PLOS Computational Biology 20, no. 10 (2024): e1012510. http://dx.doi.org/10.1371/journal.pcbi.1012510.

Abstract:
Cortical neurons are versatile and efficient coding units that develop strong preferences for specific stimulus characteristics. The sharpness of tuning and coding efficiency is hypothesized to be controlled by delicately balanced excitation and inhibition. These observations suggest a need for detailed co-tuning of excitatory and inhibitory populations. Theoretical studies have demonstrated that a combination of plasticity rules can lead to the emergence of excitation/inhibition (E/I) co-tuning in neurons driven by independent, low-noise signals. However, cortical signals are typically noisy
28

Aribowo, Widi. "Focused Time Delay Neural Network For Tuning Automatic Voltage Regulator." EMITTER International Journal of Engineering Technology 7, no. 1 (2019): 34–43. http://dx.doi.org/10.24003/emitter.v7i1.315.

Abstract:
This paper proposes a novel controller for an automatic voltage regulator (AVR) system. The controller uses a Focused Time Delay Neural Network (FTDNN), which does not require dynamic backpropagation to compute the network gradient, so the FTDNN AVR can be trained faster than other dynamic networks. Simulations were performed to compare load angle and speed. The performance of the system with the FTDNN-AVR was compared with a Conventional AVR (C-AVR) and an RNN AVR. Simulations in Matlab/Simulink show the effectiveness of the FTDNN-AVR design and its superior, robust performance in different cases.
29

Gurskiy, A. A., A. E. Goncharenko, and S. M. Dubna. "AUTOMATIC SYNTHESIS OF PETRI NETS AT TUNING UP OF THE COORDINATING AUTOMATIC CONTROL SYSTEMS." ELECTRICAL AND COMPUTER SYSTEMS 33, no. 108 (2020): 34–44. http://dx.doi.org/10.15276/eltecs.32.108.2020.4.

Abstract:
The process of automated tuning of the coordinating automatic control system is considered in this paper. This tuning process is linked to the automatic synthesis of Petri nets based on the functioning of an artificial neural network. Thereby, we can automate the process of tuning and synthesis of system models and also solve the urgent task of minimizing tuning time for multilevel control systems. The purposes of this work are to reduce tuning time and to automate the tuning of multilevel coordinating systems
30

Barabás, Béla, Ottilia Fülöp, and Gyula Pályi. "Fine-Tuning of Aspects of Chirality by Co-Authorship Networks." Symmetry 17, no. 6 (2025): 825. https://doi.org/10.3390/sym17060825.

Abstract:
In the present article, we illustrate and analyze the co-authorship network of Paul G. Mezey, focusing only on his collaborations on chirality-related papers. We consider scientific works from the Web of Science database as of 10 April 2024. Unlike previous studies on co-authorship networks, this network allows parallel edges, indicating multiple collaborations between the scientists involved. We also present a co-authorship network based on articles citing Mezey’s chirality-related papers (excluding self-citations), examining its main communities detected. Publications on the development of t
31

Kandel, Ibrahem, and Mauro Castelli. "How Deeply to Fine-Tune a Convolutional Neural Network: A Case Study Using a Histopathology Dataset." Applied Sciences 10, no. 10 (2020): 3359. http://dx.doi.org/10.3390/app10103359.

Abstract:
Accurate classification of medical images is of great importance for correct disease diagnosis. The automation of medical image classification is of great necessity because it can provide a second opinion or even a better classification in case of a shortage of experienced medical staff. Convolutional neural networks (CNN) were introduced to improve the image classification domain by eliminating the need to manually select which features to use to classify images. Training CNN from scratch requires very large annotated datasets that are scarce in the medical field. Transfer learning of CNN wei
32

Martínez, Andrea, Anna Sikora, Eduardo César, and Joan Sorribes. "ELASTIC: A Large Scale Dynamic Tuning Environment." Scientific Programming 22, no. 4 (2014): 261–71. http://dx.doi.org/10.1155/2014/403695.

Abstract:
The spectacular growth in the number of cores in current supercomputers poses design challenges for the development of performance analysis and tuning tools. To be effective, such analysis and tuning tools must be scalable and be able to manage the dynamic behaviour of parallel applications. In this work, we present ELASTIC, an environment for dynamic tuning of large-scale parallel applications. To be scalable, the architecture of ELASTIC takes the form of a hierarchical tuning network of nodes that perform a distributed analysis and tuning process. Moreover, the tuning network topology can be
33

Guo, Yunhui, Yandong Li, Liqiang Wang, and Tajana Rosing. "AdaFilter: Adaptive Filter Fine-Tuning for Deep Transfer Learning." Proceedings of the AAAI Conference on Artificial Intelligence 34, no. 04 (2020): 4060–66. http://dx.doi.org/10.1609/aaai.v34i04.5824.

Abstract:
There is an increasing number of pre-trained deep neural network models. However, it is still unclear how to effectively use these models for a new task. Transfer learning, which aims to transfer knowledge from source tasks to a target task, is an effective solution to this problem. Fine-tuning is a popular transfer learning technique for deep neural networks where a few rounds of training are applied to the parameters of a pre-trained model to adapt them to a new task. Despite its popularity, in this paper we show that fine-tuning suffers from several drawbacks. We propose an adaptive fine-tu
34

Shin, Sungho, Jinhwan Park, Yoonho Boo, and Wonyong Sung. "HLHLp: Quantized Neural Networks Training for Reaching Flat Minima in Loss Surface." Proceedings of the AAAI Conference on Artificial Intelligence 34, no. 04 (2020): 5784–91. http://dx.doi.org/10.1609/aaai.v34i04.6035.

Abstract:
Quantization of deep neural networks is essential for efficient implementations. Low-precision networks are typically designed to represent original floating-point counterparts with high fidelity, and several elaborate quantization algorithms have been developed. We propose a novel training scheme for quantized neural networks to reach flat minima in the loss surface with the aid of quantization noise. The proposed training scheme employs high-low-high-low precision in an alternating manner for network training. The learning rate is also abruptly changed at each stage for coarse- or
35

Elias, Israel, José de Jesús Rubio, David Ricardo Cruz, et al. "Hessian with Mini-Batches for Electrical Demand Prediction." Applied Sciences 10, no. 6 (2020): 2036. http://dx.doi.org/10.3390/app10062036.

Abstract:
The steepest descent method is frequently used for neural network tuning. Mini-batches are commonly used to get better tuning of the steepest descent in the neural network. Nevertheless, steepest descent with mini-batches could be delayed in reaching a minimum. The Hessian could be quicker than the steepest descent in reaching a minimum, and it is easier to achieve this goal by using the Hessian with mini-batches. In this article, the Hessian is combined with mini-batches for neural network tuning. The discussed algorithm is applied for electrical demand prediction.
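A minimal sketch of the idea (not the article's exact algorithm): on a least-squares loss, a Newton step using the mini-batch Hessian reaches the minimum far faster than plain steepest descent on the same batches. The data and sizes below are invented:

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(256, 3))
theta_true = np.array([1.0, -2.0, 0.5])
y = X @ theta_true                      # noiseless linear data

def minibatches(X, y, size):
    for i in range(0, len(X), size):
        yield X[i:i + size], y[i:i + size]

theta = np.zeros(3)
for Xb, yb in minibatches(X, y, size=32):
    grad = Xb.T @ (Xb @ theta - yb) / len(Xb)   # mini-batch gradient
    hess = Xb.T @ Xb / len(Xb)                  # mini-batch Hessian
    theta -= np.linalg.solve(hess, grad)        # Newton step per mini-batch
```

On this quadratic loss, one Newton step lands on the batch's exact least-squares solution, so `theta` matches `theta_true` after the first batch, whereas steepest descent with the same mini-batches would need many epochs.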
36

Mathan, Pinaki Shashishekhar. "Intrusion Detection Using Machine Learning Classification and Regression." International Journal of Scientific Research in Engineering and Management 09, no. 03 (2025): 1–9. https://doi.org/10.55041/ijsrem42130.

Abstract:
An Intrusion Detection System (IDS) is a crucial security mechanism designed to protect computer networks from unauthorized access and cyber threats. With the rapid expansion of Internet-based data transmission, ensuring network security has become increasingly challenging. IDS continuously monitors and analyzes network traffic to detect malicious activities, relying on datasets like KDD Cup 1999 for training and evaluation. Effective IDS development involves preprocessing steps such as feature selection, normalization, and addressing data imbalance to enhance detection accuracy. Various machi
37

Mahmoud, Magdi S., and Matasm M. Hassan Hamid. "Self-Tuning Control for MIMO Network Systems." Journal of Signal and Information Processing 03, no. 02 (2012): 154–60. http://dx.doi.org/10.4236/jsip.2012.32020.

38

Körösi, Ladislav, and Štefan Kozák. "OPTIMAL SELF TUNING NEURAL NETWORK CONTROLLER DESIGN." IFAC Proceedings Volumes 38, no. 1 (2005): 384–89. http://dx.doi.org/10.3182/20050703-6-cz-1902.01142.

Full text
APA, Harvard, Vancouver, ISO, and other styles
39

M., Paul Asir, Jeevarekha A., and Philominathan P. "Tuning chaos in network sharing common nonlinearity." Communications in Nonlinear Science and Numerical Simulation 35 (June 2016): 148–65. http://dx.doi.org/10.1016/j.cnsns.2015.11.006.

Full text
APA, Harvard, Vancouver, ISO, and other styles
40

Martynov, A. V., G. S. Nikonova, and А. N. Kondratyev. "IMPLEMENTATION OF ADAPTIVE PRODUCT CUSTOMIZATION PROCESSES IN MASS PRODUCTION." RADIO COMMUNICATION TECHNOLOGY, no. 51 (December 30, 2021): 52–58. http://dx.doi.org/10.33286/2075-8693-2021-51-52-58.

Full text
Abstract:
The article presents the results of research and implementation of adaptive calibration systems used in setting up mass-produced products. The proposed method allows you to speed up the tuning of the HF transceiver. In the process of tuning, with the help of a neural network, an adaptive correction of the approximating function of the incident wave sensor is made according to the accumulated data, which makes it possible to reduce the time for tuning the transceiver. The introduction of the mathematical apparatus of neural networks can be applied in the process of mass production for other pro
APA, Harvard, Vancouver, ISO, and other styles
41

RAJA, M., Kartikay SINGH, Aishwerya SINGH, and Ayush GUPTA. "Design of Satellite Attitude Control Systems using Adaptive Neural Networks." INCAS BULLETIN 12, no. 3 (2020): 173–82. http://dx.doi.org/10.13111/2066-8201.2020.12.3.14.

Full text
Abstract:
This paper investigates the performance of adaptive neural networks through simulations of satellite systems involving three-axis attitude control algorithms. PID tuning is the method traditionally employed. An optimally tuned PID controller minimizes the deviation from the set point and responds quickly to disturbances with minimal overshoot. However, poor performance has been observed in these controllers when manual tuning is used, which is itself a monotonous process. The PID controller tuned using Ziegler-Nichols exhibits more pronounced transient responses of the satellite such as Overshoot,
APA, Harvard, Vancouver, ISO, and other styles
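The Ziegler-Nichols tuning the abstract compares against is a pair of closed-loop rules mapping the experimentally found ultimate gain and oscillation period to PID gains. A minimal sketch using the classic rule constants, not the paper's controller:

```python
def ziegler_nichols_pid(Ku, Tu):
    """Classic Ziegler-Nichols closed-loop PID tuning.

    Ku: ultimate gain at which the closed loop oscillates steadily.
    Tu: period of that sustained oscillation (seconds).
    Returns the (Kp, Ki, Kd) gains of the parallel-form PID.
    """
    Kp = 0.6 * Ku
    Ti = Tu / 2.0   # integral time
    Td = Tu / 8.0   # derivative time
    return Kp, Kp / Ti, Kp * Td

# e.g. a loop that sustains oscillation at Ku = 10 with Tu = 2 s
Kp, Ki, Kd = ziegler_nichols_pid(10.0, 2.0)
```

These rules deliberately trade overshoot for fast disturbance rejection, which is the aggressive transient behaviour the abstract criticizes.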
42

Li, Xiaowei, Feng Liu, Defei Li, Tianchi Hu, and Mu Han. "Illegal Intrusion Detection for In-Vehicle CAN Bus Based on Immunology Principle." Symmetry 14, no. 8 (2022): 1532. http://dx.doi.org/10.3390/sym14081532.

Full text
Abstract:
The controller area network (CAN) bus has become one of the most commonly used protocols in automotive networks. Some potential attackers inject malicious data packets into the CAN bus through external interfaces for implementing illegal operations (intrusion). Anomaly detection is a technique for network intrusion detection which can detect malicious data packets by comparing the normal data packets with incoming data packets obtained from the network traffic. The data of a normal network is in a symmetric and stable state, which will become asymmetric when compromised. Considering the in-vehic
APA, Harvard, Vancouver, ISO, and other styles
43

Mencl, W. Einar. "Effects of Tuning Sharpness on Tone Categorization by Self-Organizing Neural Networks." Musicae Scientiae 2, no. 2 (1998): 129–41. http://dx.doi.org/10.1177/102986499800200202.

Full text
Abstract:
Because humans can categorize tones either in terms of pitch height or pitch class (chroma), I have explored the potential of simple self-organizing networks to demonstrate these two different capacities, and have uncovered a mechanism which can subserve both. A self-organizing neural network architecture was used; during training, the network learns to represent internally the co-existence of stimulus features (here, harmonic components). Two sets of simulations were completed, identical in all respects except for the tuning sharpness used at the input layer. In the broad tuning condition, th
APA, Harvard, Vancouver, ISO, and other styles
44

Dawodi, Mursal, and Jawid Ahmad Baktash. "Tuning Dari Speech Classification Employing Deep Neural Networks." International Journal on Natural Language Computing 12, no. 2 (2023): 25–36. http://dx.doi.org/10.5121/ijnlc.2023.12203.

Full text
Abstract:
Recently, many researchers have focused on building and improving speech recognition systems to facilitate and enhance human-computer interaction. Today, the Automatic Speech Recognition (ASR) system has become an important and common tool in applications from games to translation systems, robots, and so on. However, there is still a need for research on speech recognition systems for low-resource languages. This article deals with isolated-word recognition for the Dari language, using the Mel-frequency cepstral coefficients (MFCC) feature extraction method and three different deep neural networks, including Con
APA, Harvard, Vancouver, ISO, and other styles
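The MFCC front end mentioned in the abstract follows a standard pipeline: framing and windowing, power spectrum, triangular mel filterbank, log compression, and a DCT. A self-contained numpy sketch with common default parameters (the frame/hop sizes and filter counts are assumptions, not the paper's settings):

```python
import numpy as np

def hz_to_mel(f):
    return 2595.0 * np.log10(1.0 + f / 700.0)

def mel_to_hz(m):
    return 700.0 * (10.0 ** (m / 2595.0) - 1.0)

def mel_filterbank(n_mels, n_fft, sr):
    # triangular filters evenly spaced on the mel scale
    mels = np.linspace(hz_to_mel(0.0), hz_to_mel(sr / 2.0), n_mels + 2)
    bins = np.floor((n_fft + 1) * mel_to_hz(mels) / sr).astype(int)
    fb = np.zeros((n_mels, n_fft // 2 + 1))
    for i in range(1, n_mels + 1):
        l, c, r = bins[i - 1], bins[i], bins[i + 1]
        fb[i - 1, l:c] = (np.arange(l, c) - l) / max(c - l, 1)
        fb[i - 1, c:r] = (r - np.arange(c, r)) / max(r - c, 1)
    return fb

def mfcc(signal, sr=16000, frame=400, hop=160, n_fft=512, n_mels=26, n_ceps=13):
    # frame the signal, apply a Hamming window, take the power spectrum
    frames = [signal[i:i + frame] * np.hamming(frame)
              for i in range(0, len(signal) - frame, hop)]
    power = np.abs(np.fft.rfft(frames, n_fft)) ** 2 / n_fft
    log_mel = np.log(power @ mel_filterbank(n_mels, n_fft, sr).T + 1e-10)
    # DCT-II decorrelates the log-mel energies; keep the first n_ceps
    n = np.arange(n_mels)
    basis = np.cos(np.pi * np.outer(np.arange(n_ceps), 2 * n + 1) / (2 * n_mels))
    return log_mel @ basis.T

# one second of a 440 Hz tone yields a (frames, n_ceps) coefficient matrix
t = np.linspace(0, 1, 16000, endpoint=False)
coeffs = mfcc(np.sin(2 * np.pi * 440 * t))
```

Each row of `coeffs` is the feature vector for one 25 ms frame; sequences of such vectors are what the deep networks in the cited article would consume.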
45

Mursal, Dawodi, and Ahmad Baktash Jawid. "Tuning Dari Speech Classification Employing Deep Neural Networks." International Journal on Natural Language Computing (IJNLC) 12, no. 2 (2023): 12. https://doi.org/10.5281/zenodo.7969049.

Full text
Abstract:
Recently, many researchers have focused on building and improving speech recognition systems to facilitate and enhance human-computer interaction. Today, the Automatic Speech Recognition (ASR) system has become an important and common tool in applications from games to translation systems, robots, and so on. However, there is still a need for research on speech recognition systems for low-resource languages. This article deals with isolated-word recognition for the Dari language, using the Mel-frequency cepstral coefficients (MFCC) feature extraction method and three different deep neural networks, including Con
APA, Harvard, Vancouver, ISO, and other styles
46

Mendoza, Jessica, Isabel de-la-Bandera, David Palacios, and Raquel Barco. "QoE Optimization in a Live Cellular Network through RLC Parameter Tuning." Sensors 21, no. 16 (2021): 5619. http://dx.doi.org/10.3390/s21165619.

Full text
Abstract:
The mobile communication networks sector has experienced a great evolution during the last few years. The emergence of new services as well as the growth in the number of subscribers have motivated the search for new ways to optimize mobile networks. In this way, the objective pursued by optimization techniques has been evolving, shifting from the traditional optimization of radio parameters to the improvement of the quality perceived by users, known as quality of experience (QoE). In mobile networks, the radio link control (RLC) layer provides a reliable link between both ends of the communic
APA, Harvard, Vancouver, ISO, and other styles
47

Song, Yunfei, Yujia Zhu, Yang Liu (劉洋), and Daoxun Xia. "Black Box Watermarking for DNN Model Integrity Detection Using Label Loss." Journal of Computers (電腦學刊) 35, no. 4 (2024): 277–90. http://dx.doi.org/10.53106/199115992024083504019.

Full text
Abstract:
After significant investments of time and resources, the accuracy of deep neural network (DNN) models has reached commercially viable levels, leading to their increasing deployment on cloud platforms for commercial services. However, ongoing research indicates that the challenges facing deep neural network models are continually evolving, particularly with various attacks emerging to compromise their integrity. Deep neural networks are susceptible to poisoning attacks and backdoor attacks, both of which involve malicious fine-tuning of the deep models. Malicious fine-tuning can lead t
APA, Harvard, Vancouver, ISO, and other styles
48

Anisimov, A. A., M. E. Sorokovnin, and S. V. Tararykin. "Improving the accuracy of identification and tuning of linear systems with state controllers using an artificial neural network." Vestnik IGEU, no. 6 (December 28, 2023): 57–68. http://dx.doi.org/10.17588/2072-2672.2023.6.057-068.

Full text
Abstract:
High potential capabilities of control systems with state controllers can be realized only if automatic tuning tools are available. Since the tuning is carried out in real-time mode, which places increased demands on performance, it is proposed to use an artificial neural network to reduce its duration. However, under the conditions of noise in the measurement channels, the quality of identification of the parameters of the control object is significantly reduced. In this regard, the aim of the study is to find the optimal composition of measurement channels at the network input, which allows
APA, Harvard, Vancouver, ISO, and other styles
49

Djellali, Choukri, and Mehdi Adda. "An Enhanced Deep Learning Model to Network Attack Detection, by using Parameter Tuning, Hidden Markov Model and Neural Network." Journal of Ubiquitous Systems and Pervasive Networks 15, no. 01 (2021): 35–41. http://dx.doi.org/10.5383/juspn.15.01.005.

Full text
Abstract:
In recent years, Deep Learning has become a critical success factor for Machine Learning. In the present study, we introduced a Deep Learning model to network attack detection, by using Hidden Markov Model and Artificial Neural Networks. We used a model aggregation technique to find a single consolidated Deep Learning model for better data fitting. The model selection technique is applied to optimize the bias-variance trade-off of the expected prediction. We demonstrate its ability to reduce the convergence, reach the optimal solution and obtain more cluttered decision boundaries. Experimental
APA, Harvard, Vancouver, ISO, and other styles
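The HMM half of such a hybrid detector scores how likely an observed event sequence is under a model of normal traffic; the standard forward algorithm computes that likelihood. The two-state model below is a toy illustration, not the paper's model:

```python
import numpy as np

def forward(obs, pi, A, B):
    """HMM forward algorithm: P(observation sequence | model)."""
    alpha = pi * B[:, obs[0]]          # initialise with the first symbol
    for o in obs[1:]:
        alpha = (alpha @ A) * B[:, o]  # propagate through A, then emit
    return alpha.sum()

# toy model: states {normal, attack}, symbols {benign packet, suspicious packet}
pi = np.array([0.6, 0.4])              # initial state distribution
A = np.array([[0.7, 0.3],              # state transition probabilities
              [0.4, 0.6]])
B = np.array([[0.9, 0.1],              # emission probabilities per state
              [0.2, 0.8]])
likelihood = forward([0, 1, 0], pi, A, B)
```

A sequence whose likelihood under the "normal" model falls below a threshold would be flagged as a potential attack; the neural network side of the hybrid can then classify the flagged traffic.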
50

Yu, Xuhu, Zhong Wan, Zehao Shi, and Lei Wang. "Lipreading Using Liquid State Machine with STDP-Tuning." Applied Sciences 12, no. 20 (2022): 10484. http://dx.doi.org/10.3390/app122010484.

Full text
Abstract:
Lipreading refers to the task of decoding the text content of a speaker based on visual information about the movement of the speaker’s lips. With the development of deep learning in recent years, lipreading has attracted extensive research. However, the deep learning method requires a lot of computing resources, which is not conducive to the migration of the system to edge devices. Inspired by the work of Spiking Neural Networks (SNNs) in recognizing human actions and gestures, we propose a lipreading system based on SNNs. Specifically, we construct the front-end feature extractor of the syst
APA, Harvard, Vancouver, ISO, and other styles