Academic literature on the topic 'Network tuning'

Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles


Consult the lists of relevant articles, books, theses, conference reports, and other scholarly sources on the topic 'Network tuning.'

Next to every source in the list of references there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever these are available in the metadata.

Journal articles on the topic "Network tuning"

1

Wani, M. Arif, and Saduf Afzal. "Optimization of deep network models through fine tuning." International Journal of Intelligent Computing and Cybernetics 11, no. 3 (2018): 386–403. http://dx.doi.org/10.1108/ijicc-06-2017-0070.

Full text
Abstract:
Purpose – Many strategies have been put forward for training deep network models; however, stacking several layers of non-linearities typically results in poor propagation of gradients and activations. The purpose of this paper is to explore a two-step strategy in which an initial deep learning model is first obtained by unsupervised learning and then optimized by fine tuning. A number of fine-tuning algorithms are explored in this work for optimizing deep learning models. This includes proposing a new algorithm where Backpropagation with adaptive gain…
APA, Harvard, Vancouver, ISO, and other styles
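The two-step strategy described in this abstract (unsupervised pre-training followed by supervised fine tuning) can be illustrated with a small NumPy toy. The network sizes, learning rates, and synthetic data below are arbitrary assumptions for illustration, not the authors' setup:

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 8))                       # toy inputs
y = (X[:, 0] + X[:, 1] > 0).astype(float)[:, None]  # toy binary target

# Step 1: unsupervised pre-training -- a one-hidden-layer autoencoder
# learns an encoder W_enc by minimizing reconstruction error.
W_enc = rng.normal(scale=0.1, size=(8, 4))
W_dec = rng.normal(scale=0.1, size=(4, 8))
for _ in range(300):
    H = np.tanh(X @ W_enc)          # encode
    R = H @ W_dec                   # decode
    dR = 2 * (R - X) / len(X)       # d(MSE)/dR
    W_enc -= 0.1 * X.T @ ((dR @ W_dec.T) * (1 - H ** 2))
    W_dec -= 0.1 * H.T @ dR

# Step 2: fine tuning -- attach a supervised head and refine all the
# weights with ordinary backpropagation on the labelled task.
W_out = rng.normal(scale=0.1, size=(4, 1))
losses = []
for _ in range(300):
    H = np.tanh(X @ W_enc)
    p = 1 / (1 + np.exp(-(H @ W_out)))             # sigmoid head
    losses.append(float(np.mean(-y * np.log(p) - (1 - y) * np.log(1 - p))))
    dz = (p - y) / len(X)                          # d(BCE)/d(logit)
    W_enc -= 0.5 * X.T @ ((dz @ W_out.T) * (1 - H ** 2))
    W_out -= 0.5 * H.T @ dz
```

Fine tuning should drive the supervised loss below its starting value: the head begins from pre-trained features that already summarize the inputs, which is the motivation for the two-step strategy.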
2

Konarev, D., and A. Gulamov. "ACCURACY IMPROVING OF PRE-TRAINED NEURAL NETWORKS BY FINE TUNING." EurasianUnionScientists 5, no. 1(82) (2021): 26–28. http://dx.doi.org/10.31618/esu.2413-9335.2021.5.82.1231.

Full text
Abstract:
Methods of improving the accuracy of pre-trained networks are discussed. Images of ships are the input data for the networks. The networks are built and trained using the Keras and TensorFlow machine learning libraries. Fine tuning of previously trained convolutional artificial neural networks for pattern recognition tasks is described. Fine tuning of the VGG16 and VGG19 networks is done using Keras Applications. The accuracy of the VGG16 network with fine tuning of the last convolution unit increased from 94.38% to 95.21%, an increase of only 0.83%. The accuracy of the VGG19 network with fine tuning of the last convolution unit…
APA, Harvard, Vancouver, ISO, and other styles
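The kind of fine tuning this abstract describes, keeping the pre-trained layers frozen and adapting only the last block, can be sketched schematically. The random "pre-trained" weights below are stand-ins for the VGG16/VGG19 features, purely for illustration:

```python
import numpy as np

rng = np.random.default_rng(1)
X = rng.normal(size=(100, 6))
y = (X.sum(axis=1) > 0).astype(float)[:, None]

# "Pre-trained" weights: in the paper these come from VGG16/VGG19 via
# Keras Applications; here they are random stand-ins for illustration.
W_frozen = rng.normal(scale=0.3, size=(6, 5))   # earlier layers: frozen
W_head = rng.normal(scale=0.3, size=(5, 1))     # last block: fine-tuned

W_frozen_before = W_frozen.copy()
for _ in range(200):
    H = np.tanh(X @ W_frozen)                   # frozen features
    p = 1 / (1 + np.exp(-(H @ W_head)))         # sigmoid output
    W_head -= 0.5 * H.T @ ((p - y) / len(X))    # update only the head

assert np.array_equal(W_frozen, W_frozen_before)  # frozen layers untouched
```

In Keras terms, freezing corresponds to setting `trainable = False` on the early layers before compiling, so only the last block receives gradient updates.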
3

Machens, Christian K., and Carlos D. Brody. "Design of Continuous Attractor Networks with Monotonic Tuning Using a Symmetry Principle." Neural Computation 20, no. 2 (2008): 452–85. http://dx.doi.org/10.1162/neco.2007.07-06-297.

Full text
Abstract:
Neurons that sustain elevated firing in the absence of stimuli have been found in many neural systems. In graded persistent activity, neurons can sustain firing at many levels, suggesting a widely found type of network dynamics in which networks can relax to any one of a continuum of stationary states. The reproduction of these findings in model networks of nonlinear neurons has turned out to be nontrivial. A particularly insightful model has been the “bump attractor,” in which a continuous attractor emerges through an underlying symmetry in the network connectivity matrix. This model, however
APA, Harvard, Vancouver, ISO, and other styles
4

Ge, Changhan, Ajay Mahimkar, Zihui Ge, et al. "Iridescence: Improving Configuration Tuning in the Presence of Confounders for 5G NSA Networks." Proceedings of the ACM on Networking 3, CoNEXT1 (2025): 1–22. https://doi.org/10.1145/3709378.

Full text
Abstract:
Configuration tuning is one of the top network operational tasks for Cellular Service Providers (CSPs), and is typically done either to restore performance during degraded network conditions (such as congestion, failures, or planned upgrades) or to optimize service performance through change trials. A long-standing challenge in tuning has been to establish a causal relationship between a configuration change and a service performance impact. Confounders (or external factors) make this extremely hard. In this paper, we focus on improving configuration tuning in the presence of confounders for 5G Non-standalone…
APA, Harvard, Vancouver, ISO, and other styles
5

Menapace, Andrea, Ariele Zanfei, and Maurizio Righetti. "Tuning ANN Hyperparameters for Forecasting Drinking Water Demand." Applied Sciences 11, no. 9 (2021): 4290. http://dx.doi.org/10.3390/app11094290.

Full text
Abstract:
The evolution of smart water grids leads to new Big Data challenges, boosting the development and application of Machine Learning techniques to support efficient and sustainable drinking water management. These powerful techniques rely on hyperparameters, making model tuning a tricky and crucial task. We hence propose an insightful analysis of the tuning of Artificial Neural Networks for drinking water demand forecasting. This study focuses on fitting the layer and node hyperparameters of different Neural Network architectures through a grid search method, varying the dataset, prediction horizon…
APA, Harvard, Vancouver, ISO, and other styles
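A grid search over layer and node hyperparameters, as in this study, boils down to enumerating every configuration and keeping the one with the lowest validation error. The `validation_error` function below is a synthetic stand-in (the real score would come from training an ANN on water-demand data), chosen only to make the loop runnable:

```python
from itertools import product

# Stand-in validation score: in the study this would be the forecasting
# error of an ANN trained on drinking-water-demand data; this synthetic
# function simply has its minimum at 2 layers and 16 nodes.
def validation_error(n_layers, n_nodes):
    return (n_layers - 2) ** 2 + (n_nodes - 16) ** 2 / 64

grid = {"n_layers": [1, 2, 3], "n_nodes": [8, 16, 32, 64]}

# Grid search: evaluate every (layers, nodes) combination, keep the best.
best = min(product(grid["n_layers"], grid["n_nodes"]),
           key=lambda cfg: validation_error(*cfg))
print(best)  # → (2, 16), the configuration with the lowest error
```

The cost grows multiplicatively with each added hyperparameter axis, which is why the study restricts itself to layers and nodes.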
6

Freitag, J., N. L. S. da Fonseca, and J. F. de Rezende. "Tuning of 802.11e network parameters." IEEE Communications Letters 10, no. 8 (2006): 611–13. http://dx.doi.org/10.1109/lcomm.2006.1665127.

Full text
APA, Harvard, Vancouver, ISO, and other styles
7

Liu, Junhong, and Jouni Lampinen. "Approximation by Growing Radial Basis Function Networks Using the Differential-Evolution-Based Algorithm." Journal of Advanced Computational Intelligence and Intelligent Informatics 9, no. 5 (2005): 540–48. http://dx.doi.org/10.20965/jaciii.2005.p0540.

Full text
Abstract:
The differential evolution (DE) algorithm is a floating-point-encoded evolutionary algorithm for global optimization. We applied a DE-based method to training radial basis function (RBF) networks with variables including centers, weights, and widths. This algorithm consists of three steps: initial tuning focusing on finding the center of a one-node RBF network, and local tuning and global tuning, both using cycling schemes to find RBF network parameters. The mean square error from desired output to actual network output is applied as the objective function to be minimized. Network training is shown…
APA, Harvard, Vancouver, ISO, and other styles
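The DE algorithm named in this abstract can be sketched in a few lines. Here the classic DE/rand/1/bin variant minimizes a simple quadratic standing in for the RBF network's mean square error; the population size and the F and CR constants are arbitrary assumptions:

```python
import numpy as np

rng = np.random.default_rng(2)

def mse(x):
    # Stand-in objective: in the paper this is the mean square error of
    # the RBF network; here a simple quadratic is minimized instead.
    return float(np.sum((x - np.array([1.0, -2.0])) ** 2))

NP, D, F, CR = 20, 2, 0.7, 0.9          # population, dims, mutation, crossover
pop = rng.uniform(-5, 5, size=(NP, D))
fit = np.array([mse(x) for x in pop])

for _ in range(200):                    # classic DE/rand/1/bin
    for i in range(NP):
        idx = rng.choice([j for j in range(NP) if j != i], 3, replace=False)
        a, b, c = pop[idx]
        mutant = a + F * (b - c)                    # mutation
        cross = rng.random(D) < CR
        cross[rng.integers(D)] = True               # ensure one gene crosses
        trial = np.where(cross, mutant, pop[i])     # binomial crossover
        f_trial = mse(trial)
        if f_trial < fit[i]:                        # greedy selection
            pop[i], fit[i] = trial, f_trial

# The best population member should sit close to the optimum at (1, -2).
```

Because selection is greedy per individual, the population's best fitness is monotonically non-increasing, one reason DE is popular for network parameter tuning.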
8

Li, Tianxi, Elizaveta Levina, and Ji Zhu. "Network cross-validation by edge sampling." Biometrika 107, no. 2 (2020): 257–76. http://dx.doi.org/10.1093/biomet/asaa006.

Full text
Abstract:
Summary While many statistical models and methods are now available for network analysis, resampling of network data remains a challenging problem. Cross-validation is a useful general tool for model selection and parameter tuning, but it is not directly applicable to networks since splitting network nodes into groups requires deleting edges and destroys some of the network structure. In this paper we propose a new network resampling strategy, based on splitting node pairs rather than nodes, that is applicable to cross-validation for a wide range of network model selection tasks. We provide th
APA, Harvard, Vancouver, ISO, and other styles
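The core idea here, splitting node pairs rather than nodes into cross-validation folds, can be sketched as follows. The toy graph, fold count, and edge density are illustrative assumptions, not the authors' experimental setup:

```python
import numpy as np

rng = np.random.default_rng(3)
n, K = 10, 3

# All node pairs (i < j) of an undirected network on n nodes.
pairs = np.array([(i, j) for i in range(n) for j in range(i + 1, n)])

# Split node PAIRS (not nodes) into K folds: each fold's pairs are
# hidden while the model is fitted on the remaining entries.
fold_of = rng.integers(K, size=len(pairs))

A = rng.random((n, n)) < 0.3
A = np.triu(A, 1)
A = A | A.T                                # toy symmetric adjacency matrix

for k in range(K):
    observed = np.ones((n, n), dtype=bool)
    for i, j in pairs[fold_of == k]:
        observed[i, j] = observed[j, i] = False   # hold out this pair
    A_train = A & observed
    # ...fit a network model to A_train, score it on the hidden pairs...
```

Masking entries of the adjacency matrix keeps every node in the training graph, which is exactly what node-splitting cross-validation cannot do.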
9

Polyakova, M. V. "RCF-ST: RICHER CONVOLUTIONAL FEATURES NETWORK WITH STRUCTURAL TUNING FOR THE EDGE DETECTION ON NATURAL IMAGES." Radio Electronics, Computer Science, Control, no. 4 (January 4, 2024): 122. http://dx.doi.org/10.15588/1607-3274-2023-4-12.

Full text
Abstract:
Context. The problem of automating of the edge detection on natural images in intelligent systems is considered. The subject of the research is the deep learning convolutional neural networks for edge detection on natural images.
 Objective. The objective of the research is to improve the edge detection performance of natural images by structural tuning the richer convolutional features network architecture.
 Method. In general, edge detection performance is influenced by the neural network architecture. To automate the design of the network structure, the paper proposes a structural tuning…
APA, Harvard, Vancouver, ISO, and other styles
10

Ha, Seokhyeon, Sunbeom Jeong, and Jungwoo Lee. "Domain-Aware Fine-Tuning: Enhancing Neural Network Adaptability." Proceedings of the AAAI Conference on Artificial Intelligence 38, no. 11 (2024): 12261–69. http://dx.doi.org/10.1609/aaai.v38i11.29116.

Full text
Abstract:
Fine-tuning pre-trained neural network models has become a widely adopted approach across various domains. However, it can lead to the distortion of pre-trained feature extractors that already possess strong generalization capabilities. Mitigating feature distortion during adaptation to new target domains is crucial. Recent studies have shown promising results in handling feature distortion by aligning the head layer on in-distribution datasets before performing fine-tuning. Nonetheless, a significant limitation arises from the treatment of batch normalization layers during fine-tuning, leading…
APA, Harvard, Vancouver, ISO, and other styles
More sources

Dissertations / Theses on the topic "Network tuning"

1

Lekutai, Gaviphat. "Adaptive Self-Tuning Neuro Wavelet Network Controllers." Diss., Virginia Tech, 1997. http://hdl.handle.net/10919/30603.

Full text
Abstract:
Single layer feed forward neural networks with hidden nodes of adaptive wavelet functions (wavenets) have been successfully demonstrated to have potential in many applications. Yet applications in the process control area have not been investigated. In this paper an application to a self-tuning design method for an unknown nonlinear system is presented. Different types of frame wavelet functions are integrated for their simplicity, availability, and capability of constructing adaptive controllers. Infinite impulse response (IIR) recurrent structures are combined in cascade to the network to
APA, Harvard, Vancouver, ISO, and other styles
2

Dolson, David C. "Attentive object recognition in the selective tuning network." Thesis, National Library of Canada = Bibliothèque nationale du Canada, 1997. http://www.collectionscanada.ca/obj/s4/f2/dsk2/ftp04/mq29238.pdf.

Full text
APA, Harvard, Vancouver, ISO, and other styles
3

Thompson, Mark. "Controlling the Pi impedance matching network for fast antenna tuning." Thesis, University of York, 1997. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.245897.

Full text
APA, Harvard, Vancouver, ISO, and other styles
4

Li, Zhe. "Optimizing neural network structures: faster speed, smaller size, less tuning." Diss., University of Iowa, 2018. https://ir.uiowa.edu/etd/6460.

Full text
Abstract:
Deep neural networks have achieved tremendous success in many domains (e.g., computer vision~\cite{Alexnet12,vggnet15,fastrcnn15}, speech recognition~\cite{hinton2012deep,dahl2012context}, natural language processing~\cite{dahl2012context,collobert2011natural}, games~\cite{silver2017mastering,silver2016mastering}); however, there are still many challenges in the deep learning community, such as how to speed up the training of large deep neural networks, how to compress large neural networks for mobile/embedded devices without performance loss, and how to automatically design the optimal network structures for a ce
APA, Harvard, Vancouver, ISO, and other styles
5

Boyer, de la Giroday Anna. "Automatic fine tuning of cavity filters." Thesis, Linköpings universitet, Artificiell intelligens och integrerad datorsystem, 2016. http://urn.kb.se/resolve?urn=urn:nbn:se:liu:diva-129576.

Full text
Abstract:
Cavity filters are a necessary component in base stations used for telecommunication. Without these filters it would not be possible for base stations to send and receive signals at the same time. Today these cavity filters require fine tuning by humans before they can be deployed. This thesis has designed and implemented a neural network that can tune cavity filters. Different types of design parameters have been evaluated, such as neural network architecture, data presentation and data preprocessing. While the results were not comparable to human fine tuning, it was shown that there was a re
APA, Harvard, Vancouver, ISO, and other styles
6

Visaggi, Salvatore. "Multimodal Side-Tuning for Code Snippets Programming Language Recognition." Master's thesis, Alma Mater Studiorum - Università di Bologna, 2021. http://amslaurea.unibo.it/22993/.

Full text
Abstract:
Automatically identifying the programming language of a portion of source code is a problem that still presents several difficulties. The number of programming languages, the amount of code published and released as open source, and the number of developers producing and publishing new source code are constantly growing. There are many reasons why tools capable of recognizing the language of source code snippets are needed. For example, such tools find application in areas such as code search…
APA, Harvard, Vancouver, ISO, and other styles
7

Smith, Nathanael J. "Novel Closed-Loop Matching Network Topology for Reconfigurable Antenna Applications." The Ohio State University, 2014. http://rave.ohiolink.edu/etdc/view?acc_num=osu1387733249.

Full text
APA, Harvard, Vancouver, ISO, and other styles
8

Pitaro, Raffaele. "McGiver: Module Classifier using fine tuning Machine Learning techniques." Master's thesis, Alma Mater Studiorum - Università di Bologna, 2019.

Find full text
Abstract:
The automated classification of digitized documents into predefined categories has attracted great interest since the 2000s. This is due to the noticeable increase in documents in digital format and to the growing need to give them a hierarchical organization. Moreover, mainly because of the large volume of documents to be categorized, in recent years it has become necessary to handle this task in an automated way. In the corporate world, these problems are often tackled by means of proprietary 'black box' solutions. Such solutions turn out to be…
APA, Harvard, Vancouver, ISO, and other styles
9

Alghafir, Mohammed Najib. "Neural network modelling of automotive dampers for variable temperature operation and suspension system tuning." Thesis, University of Sussex, 2009. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.496859.

Full text
Abstract:
This thesis focuses on modelling of passive hydraulic automotive dampers for use in computationally-fast vehicle-dynamic simulation. An extended version of the Duym and Reybrouck 1998 physical model is examined to enable work with high frequency input displacements. This computationally-expensive model is verified with real damper data under both isothermal and variable temperature regular, and random (Pave) input displacement conditions. Initially the extension includes just additional input kinematics to account for inertial effects, with an imposed temperature profile. Subsequently a heat g
APA, Harvard, Vancouver, ISO, and other styles
10

Zheng, Cong. "Loosely Coupled Transformer and Tuning Network Design for High-Efficiency Inductive Power Transfer Systems." Diss., Virginia Tech, 2015. http://hdl.handle.net/10919/52893.

Full text
Abstract:
Transferring signals without wires has been widely accepted since the introduction of cellular and WiFi technology; hence the power cable is the last wire that has yet to be eliminated. Inductive power transfer (IPT) has drawn substantial interest in both academia and industry due to its advantages, including convenience, the absence of cables and connectors, no electric shock issues, and the ability to work in some extreme environments. After a thorough literature review of IPT systems, two major drawbacks, low power efficiency and coil displacement sensitivity, are identified…
APA, Harvard, Vancouver, ISO, and other styles
More sources

Books on the topic "Network tuning"

1

Dolson, David C. Attentive object recognition in the selective tuning network. University of Toronto, Dept. of Computer Science, 1997.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
2

Miscovich, Gina. The SCO performance tuning handbook. PTR Prentice Hall, 1994.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
3

International Business Machines Corporation. International Technical Support Organization., ed. DS8000 performance monitoring and tuning. IBM, International Technical Support Organization, 2009.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
4

International Business Machines Corporation. International Technical Support Organization., ed. DS8000 performance monitoring and tuning. IBM, International Technical Support Organization, 2009.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
5

International Business Machines Corporation. International Technical Support Organization., ed. DS8000 performance monitoring and tuning. IBM, International Technical Support Organization, 2009.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
6

Held, Gilbert. LAN testing and troubleshooting: Reliability tuning techniques. Wiley, 1996.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
7

Loukides, Michael Kosta. System performance tuning. O'Reilly & Associates, 1992.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
8

Cathy, Warrick, and International Business Machines Corporation. International Technical Support Organization., eds. IBM TotalStorage DS8000 series: Performance monitoring and tuning. IBM, International Technical Support Organization, 2005.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
9

Bertrand, Dufrasne, and International Business Machines Corporation. International Technical Support Organization, eds. DS4000 best practices and performance tuning guide. 3rd ed. IBM, International Technical Support Organization, 2007.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
10

Borri, Claudio, and Francesco Maffioli, eds. Re-engineering Engineering Education in Europe. Firenze University Press, 2008. http://dx.doi.org/10.36253/978-88-8453-676-1.

Full text
Abstract:
Contributing to the development and the enrichment of the European dimension in Engineering Education (EE) constituted the global goal of TREE: in other words, to enhance the compatibility of the many diverse routes to the status of Professional Engineer which exist in Europe and, hence, to facilitate greater mobility of skilled personnel and integration of the various situations throughout Europe. The activity of the TN TREE, made up of some 110 higher education Institutions and Associations, has been developed along four main lines: A. the tuning line; B. the education and research line; C. th…
APA, Harvard, Vancouver, ISO, and other styles
More sources

Book chapters on the topic "Network tuning"

1

Ioualalen, Arnault, and Matthieu Martel. "Neural Network Precision Tuning." In Quantitative Evaluation of Systems. Springer International Publishing, 2019. http://dx.doi.org/10.1007/978-3-030-30281-8_8.

Full text
APA, Harvard, Vancouver, ISO, and other styles
2

Gu, Qizheng. "Matching Network Tuning and Control Methods." In RF Tunable Devices and Subsystems: Methods of Modeling, Analysis, and Applications. Springer International Publishing, 2014. http://dx.doi.org/10.1007/978-3-319-09924-8_6.

Full text
APA, Harvard, Vancouver, ISO, and other styles
3

Laros III, James H., Kevin Pedretti, Suzanne M. Kelly, et al. "Network Bandwidth Tuning During Application Runtime." In Energy-Efficient High Performance Computing. Springer London, 2012. http://dx.doi.org/10.1007/978-1-4471-4492-2_7.

Full text
APA, Harvard, Vancouver, ISO, and other styles
4

Bartz-Beielstein, Thomas, and Martin Zaefferer. "Hyperparameter Tuning Approaches." In Hyperparameter Tuning for Machine and Deep Learning with R. Springer Nature Singapore, 2023. http://dx.doi.org/10.1007/978-981-19-5170-1_4.

Full text
Abstract:
This chapter provides a broad overview of the different hyperparameter tuning approaches. It details the process of HPT and discusses popular HPT approaches and difficulties. It focuses on surrogate optimization, because this is the most powerful approach. It introduces the Sequential Parameter Optimization Toolbox (SPOT) as one typical surrogate method. SPOT is well established and maintained, open source, available on the Comprehensive R Archive Network (CRAN), and catches mistakes. Because SPOT is open source and well documented, the human remains in the loop of decision-making. The introduction o
APA, Harvard, Vancouver, ISO, and other styles
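Surrogate-based tuning in the SPOT spirit can be sketched as: evaluate an initial design, fit a cheap surrogate to the results, then spend further expensive evaluations where the surrogate predicts the lowest loss. The quadratic surrogate and the synthetic objective below are illustrative assumptions, not SPOT's actual machinery:

```python
import numpy as np

# Hypothetical expensive objective: think "validation loss as a function
# of one hyperparameter". It stands in for a costly tuning run.
def expensive_eval(x):
    return (x - 0.3) ** 2 + 0.01 * np.sin(20 * x)

X = list(np.linspace(0.0, 1.0, 5))       # initial design
y = [expensive_eval(x) for x in X]

for _ in range(10):
    coeffs = np.polyfit(X, y, deg=2)               # quadratic surrogate
    grid = np.linspace(0.0, 1.0, 201)
    x_new = float(grid[np.argmin(np.polyval(coeffs, grid))])
    X.append(x_new)                      # evaluate where the surrogate
    y.append(expensive_eval(x_new))      # predicts the lowest loss

best = X[int(np.argmin(y))]              # best hyperparameter found
```

Real surrogate tools typically use Kriging or random-forest surrogates with an acquisition criterion that balances exploration and exploitation; the quadratic fit here only shows the loop structure.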
5

Farrugia, Michael, Neil Hurley, and Aaron Quigley. "SAINT: Supervised Actor Identification for Network Tuning." In Lecture Notes in Social Networks. Springer Netherlands, 2013. http://dx.doi.org/10.1007/978-94-007-6359-3_6.

Full text
APA, Harvard, Vancouver, ISO, and other styles
6

Potočnik, P., and I. Grabec. "Adaptive Self-Tuning Neural-Network-Based Controller." In Computational Intelligence in Systems and Control Design and Applications. Springer Netherlands, 2000. http://dx.doi.org/10.1007/978-94-010-9040-7_4.

Full text
APA, Harvard, Vancouver, ISO, and other styles
7

Ferro, Quentin, Stef Graillat, Thibault Hilaire, Fabienne Jézéquel, and Basile Lewandowski. "Neural Network Precision Tuning Using Stochastic Arithmetic." In Lecture Notes in Computer Science. Springer International Publishing, 2022. http://dx.doi.org/10.1007/978-3-031-21222-2_10.

Full text
APA, Harvard, Vancouver, ISO, and other styles
8

Ashouri, Amir H., Gianluca Palermo, John Cavazos, and Cristina Silvano. "Selecting the Best Compiler Optimizations: A Bayesian Network Approach." In Automatic Tuning of Compilers Using Machine Learning. Springer International Publishing, 2017. http://dx.doi.org/10.1007/978-3-319-71489-9_3.

Full text
APA, Harvard, Vancouver, ISO, and other styles
9

Bolt, Janneke H., and Linda C. van der Gaag. "Balanced Tuning of Multi-dimensional Bayesian Network Classifiers." In Lecture Notes in Computer Science. Springer International Publishing, 2015. http://dx.doi.org/10.1007/978-3-319-20807-7_19.

Full text
APA, Harvard, Vancouver, ISO, and other styles
10

Michalski, Jerzy Julian, Jacek Gulgowski, Tomasz Kacmajor, and Mateusz Mazur. "Artificial Neural Network in Microwave Cavity Filter Tuning." In Microwave and Millimeter Wave Circuits and Systems. John Wiley & Sons, Ltd, 2012. http://dx.doi.org/10.1002/9781118405864.ch2.

Full text
APA, Harvard, Vancouver, ISO, and other styles

Conference papers on the topic "Network tuning"

1

Cottis, R. A., W. Bogaerts, and Z. Diamantidis. "Tuning the Internet for Corrosion." In CORROSION 1999. NACE International, 1999. https://doi.org/10.5006/c1999-99239.

Full text
Abstract:
Abstract The Internet provides a powerful, content-neutral mechanism for the presentation and delivery of information, but it has limitations for corrosion applications. These include the difficulty of defining corrosion specific information and the inability to accommodate commercial information. The Thematic Network described here is an open group project that aims to establish mechanisms for the indexing and exchange of corrosion information over the Internet.
APA, Harvard, Vancouver, ISO, and other styles
2

Zaid, Hussein, Pooyan Safari, Behnam Shariati, Aydin Jafari, Mihail Balanici, and Johannes Karl Fischer. "Multi-Agent Design for LLM-assisted Network Management." In Optical Fiber Communication Conference. Optica Publishing Group, 2025. https://doi.org/10.1364/ofc.2025.w1a.4.

Full text
Abstract:
We propose a network automation solution using pre-trained LLMs and advanced prompt engineering that ensures NDA confidentiality compliance. Our method achieves a 96.7% success rate in executing user intent into network operations without model fine-tuning.
APA, Harvard, Vancouver, ISO, and other styles
3

Toribio, David Moses B., Robert Y. Pascual, and Adomar L. Ilao. "Tuning Prediction Model Performance using Neural Network Algorithm Learning Parameters." In 2024 22nd International Conference on ICT and Knowledge Engineering (ICT&KE). IEEE, 2024. https://doi.org/10.1109/ictke62841.2024.10787197.

Full text
APA, Harvard, Vancouver, ISO, and other styles
4

Rehan, Ahmed, Igor Boiko, and Yahya Zweiri. "PID Tuning Using Neural Network Classification of Self-Excited Oscillations." In 2024 17th International Workshop on Variable Structure Systems (VSS). IEEE, 2024. http://dx.doi.org/10.1109/vss61690.2024.10753381.

Full text
APA, Harvard, Vancouver, ISO, and other styles
5

Suetterlein, Joshua, Stephen J. Young, Jesun Firoz, et al. "HPC Network Simulation Tuning via Automatic Extraction of Hardware Parameters." In 2024 IEEE High Performance Extreme Computing Conference (HPEC). IEEE, 2024. https://doi.org/10.1109/hpec62836.2024.10938439.

Full text
APA, Harvard, Vancouver, ISO, and other styles
6

Li, Jia, Gan Liu, and Junzhong Ji. "Pre-Training and Fine-Tuning Transformer for Brain Network Classification." In 2024 IEEE International Conference on Medical Artificial Intelligence (MedAI). IEEE, 2024. https://doi.org/10.1109/medai62885.2024.00025.

Full text
APA, Harvard, Vancouver, ISO, and other styles
7

Guo, Hong, Nianhui Guo, Christoph Meinel, and Haojin Yang. "Low-bit CUTLASS GEMM Template Auto-tuning using Neural Network." In 2024 IEEE International Symposium on Parallel and Distributed Processing with Applications (ISPA). IEEE, 2024. https://doi.org/10.1109/ispa63168.2024.00057.

Full text
APA, Harvard, Vancouver, ISO, and other styles
8

Patel, Sourav, Supriya Bharat Wakchaure, Ayush Kumar Sharma, and Abirami S. "Enhanced Convolution Neural Network with Optimized Pooling and Hyperparameter Tuning for Network Intrusion Detection." In 2024 IEEE Silchar Subsection Conference (SILCON). IEEE, 2024. https://doi.org/10.1109/silcon63976.2024.10910612.

Full text
APA, Harvard, Vancouver, ISO, and other styles
9

Chen, Ziteng, Menghao Zhang, Guanyu Li, et al. "Paraleon: Automatic and Adaptive Tuning for DCQCN Parameters in RDMA Networks." In 2024 IEEE 32nd International Conference on Network Protocols (ICNP). IEEE, 2024. https://doi.org/10.1109/icnp61940.2024.10858517.

Full text
APA, Harvard, Vancouver, ISO, and other styles
10

Xu, Runxin, Fuli Luo, Baobao Chang, Songfang Huang, and Fei Huang. "S4-Tuning: A Simple Cross-lingual Sub-network Tuning Method." In Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics (Volume 2: Short Papers). Association for Computational Linguistics, 2022. http://dx.doi.org/10.18653/v1/2022.acl-short.58.

Full text
APA, Harvard, Vancouver, ISO, and other styles

Reports on the topic "Network tuning"

1

Downard, Alicia, Stephen Semmens, and Bryant Robbins. Automated characterization of ridge-swale patterns along the Mississippi River. Engineer Research and Development Center (U.S.), 2021. http://dx.doi.org/10.21079/11681/40439.

Full text
Abstract:
The orientation of constructed levee embankments relative to alluvial swales is a useful measure for identifying regions susceptible to backward erosion piping (BEP). This research was conducted to create an automated, efficient process to classify patterns and orientations of swales within the Lower Mississippi Valley (LMV) to support levee risk assessments. Two machine learning algorithms are used to train the classification models: a convolutional neural network and a U-net. The resulting workflow can identify linear topographic features but is unable to reliably differentiate swales from o
APA, Harvard, Vancouver, ISO, and other styles
2

Lenz, Lutz. Automatic Tuning of Integrated Filters Using Neural Networks. Portland State University Library, 2000. http://dx.doi.org/10.15760/etd.6488.

Full text
APA, Harvard, Vancouver, ISO, and other styles
3

Wang, Caixia, and Andrew Coulson. Fine Tuning MobileNet Neural Networks for Oil Spill Detection. Purdue University, 2023. http://dx.doi.org/10.5703/1288284317671.

Full text
APA, Harvard, Vancouver, ISO, and other styles
4

Lai, Chin-Ta, and Joel Conte. Dynamic Modeling of the UC San Diego NHERI Six-Degree-of-Freedom Large High-Performance Outdoor Shake Table. Pacific Earthquake Engineering Research Center, University of California, Berkeley, CA, 2024. http://dx.doi.org/10.55461/jsds5228.

Full text
Abstract:
The UC San Diego Large High-Performance Outdoor Shake Table (LHPOST), which was commissioned on October 1, 2004 as a shared-use experimental facility of the National Science Foundation (NSF) Network for Earthquake Engineering Simulation (NEES) program, was upgraded from its original one degree-of-freedom (LHPOST) to a six degree-of-freedom configuration (LHPOST6) between October 2019 and April 2022. The LHPOST6 is a shared-use experimental facility of the NSF Natural Hazard Engineering Research Infrastructure (NHERI) program. A mechanics-based numerical model of the LHPOST6 able to capture the
APA, Harvard, Vancouver, ISO, and other styles
5

Asaeda, H., H. Liu, and Q. Wu. Tuning the Behavior of the Internet Group Management Protocol (IGMP) and Multicast Listener Discovery (MLD) for Routers in Mobile and Wireless Networks. RFC Editor, 2012. http://dx.doi.org/10.17487/rfc6636.

Full text
APA, Harvard, Vancouver, ISO, and other styles
6

Liu, Tairan. Addressing Urban Traffic Congestion: A Deep Reinforcement Learning-Based Approach. Mineta Transportation Institute, 2025. https://doi.org/10.31979/mti.2025.2322.

Full text
Abstract:
In an innovative venture, the research team embarked on a mission to redefine urban traffic flow by introducing an automated way to manage traffic light timings. This project integrates two critical technologies, Deep Q-Networks (DQN) and Auto-encoders, into reinforcement learning, with the goal of making traffic smoother and reducing the all-too-common road congestion in simulated city environments. Deep Q-Networks (DQN) are a form of reinforcement learning algorithm that learns the best actions to take in various situations through trial and error. Auto-encoders, on the other hand, are tools…
APA, Harvard, Vancouver, ISO, and other styles
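The reinforcement learning update at the heart of a DQN can be shown in its tabular form. The DQN in the report replaces the table below with a neural network (plus experience replay and a target network), but the Bellman-style update is the same idea; the corridor environment is a toy assumption:

```python
import numpy as np

rng = np.random.default_rng(4)

# Tabular Q-learning on a toy corridor with the goal at the right end.
# Update rule: Q(s,a) += alpha * (r + gamma * max_a' Q(s',a') - Q(s,a)).
n_states, n_actions = 5, 2              # actions: 0 = left, 1 = right
Q = np.zeros((n_states, n_actions))
alpha, gamma, eps = 0.5, 0.9, 0.2

for _ in range(500):                    # episodes
    s = 0
    for _ in range(20):                 # steps per episode
        if rng.random() < eps:
            a = int(rng.integers(n_actions))        # explore
        else:
            a = int(Q[s].argmax())                  # exploit
        s2 = min(s + 1, n_states - 1) if a == 1 else max(s - 1, 0)
        r = 1.0 if s2 == n_states - 1 else 0.0      # reward at the goal
        Q[s, a] += alpha * (r + gamma * Q[s2].max() - Q[s, a])
        s = s2
        if r == 1.0:
            break

# The greedy policy now moves right toward the goal from every state.
```

In the traffic-light setting, states would be encoded intersection observations (compressed by the auto-encoder), actions would be signal-phase choices, and the reward would penalize congestion.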