Academic literature on the topic 'Adam optimization'

Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles


Consult the lists of relevant articles, books, theses, conference reports, and other scholarly sources on the topic 'Adam optimization.'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate a bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever these are available in the metadata.

Journal articles on the topic "Adam optimization"

1

Shao, Yichuan, Jiantao Wang, Haijing Sun, et al. "An Improved BGE-Adam Optimization Algorithm Based on Entropy Weighting and Adaptive Gradient Strategy." Symmetry 16, no. 5 (2024): 623. http://dx.doi.org/10.3390/sym16050623.

Abstract:
This paper introduces an enhanced variant of the Adam optimizer, the BGE-Adam optimization algorithm, which integrates three innovative techniques to improve the adaptability, convergence, and robustness of the original algorithm under various training conditions. First, the BGE-Adam algorithm incorporates a dynamic β parameter adjustment mechanism that uses the rate of gradient variation to dynamically adjust the exponential decay rates of the first and second moment estimates (β1 and β2). The adjustment of β1 and β2 is symmetrical, which means that the rules the algorithm considers …
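For orientation, the standard Adam update that variants such as BGE-Adam build on can be sketched in a few lines of NumPy. This is a generic sketch of Adam (Kingma and Ba, 2014), not the paper's BGE-Adam implementation; the dynamic-β mechanism described in the abstract would replace the constant beta1/beta2 below.

```python
import numpy as np

def adam_step(theta, grad, m, v, t, lr=1e-3, beta1=0.9, beta2=0.999, eps=1e-8):
    """One step of the standard Adam update.

    BGE-Adam, per the abstract, would adapt beta1/beta2 from the rate of
    gradient variation instead of keeping them constant.
    """
    m = beta1 * m + (1 - beta1) * grad        # first-moment (mean) estimate
    v = beta2 * v + (1 - beta2) * grad**2     # second-moment estimate
    m_hat = m / (1 - beta1**t)                # bias correction, t starts at 1
    v_hat = v / (1 - beta2**t)
    theta = theta - lr * m_hat / (np.sqrt(v_hat) + eps)
    return theta, m, v
```

On a toy quadratic loss, a few hundred such steps drive the parameter toward the minimum.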
2

Shao, Yichuan, Jiapeng Yang, Wen Zhou, et al. "An Improvement of Adam Based on a Cyclic Exponential Decay Learning Rate and Gradient Norm Constraints." Electronics 13, no. 9 (2024): 1778. http://dx.doi.org/10.3390/electronics13091778.

Abstract:
Aiming at a series of limitations of the Adam algorithm, such as hyperparameter sensitivity and unstable convergence, this paper proposes an improved optimization algorithm, the Cycle-Norm-Adam (CN-Adam) algorithm. The algorithm integrates the ideas of a cyclic exponential decay learning rate (CEDLR) and gradient norm constraints, accelerating the convergence speed of the Adam model and improving its generalization performance by dynamically adjusting the learning rate. In order to verify the effectiveness of the CN-Adam algorithm, we conducted extensive experimental studies. …
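The two ingredients named in the abstract, a cyclic exponential decay learning rate and a gradient norm constraint, can each be sketched generically. The function names and constants here are illustrative assumptions, not the paper's exact formulas.

```python
import numpy as np

def cyclic_exp_decay_lr(step, base_lr=1e-3, cycle_len=1000, decay=0.995):
    """Learning rate that decays exponentially within each cycle and then
    restarts at base_lr (a generic sketch of a CEDLR-style schedule;
    cycle_len and decay are assumed values)."""
    return base_lr * decay ** (step % cycle_len)

def constrain_grad_norm(grad, max_norm=1.0):
    """Rescale the gradient so its L2 norm never exceeds max_norm."""
    norm = np.linalg.norm(grad)
    return grad * (max_norm / norm) if norm > max_norm else grad
```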
3

Jais, Imran Khan Mohd, Amelia Ritahani Ismail, and Syed Qamrun Nisa. "Adam Optimization Algorithm for Wide and Deep Neural Network." Knowledge Engineering and Data Science 2, no. 1 (2019): 41. http://dx.doi.org/10.17977/um018v2i12019p41-46.

Abstract:
The objective of this research is to evaluate the effects of Adam when used together with a wide and deep neural network. The dataset used was a diagnostic breast cancer dataset taken from the UCI Machine Learning Repository. Then, the dataset was fed into a conventional neural network for a benchmark test. Afterwards, the dataset was fed into the wide and deep neural network with and without Adam. It was found that there were improvements in the result of the wide and deep network with Adam. In conclusion, Adam is able to improve the performance of a wide and deep neural network.
4

Eren Dio Sefrila, Basuki Rahmat, and Andreas Nugroho Sihananto. "Implementasi Arsitektur Inception V3 Dengan Optimasi Adam, SGD dan RMSP Pada Klasifikasi Penyakit Malaria." Bridge : Jurnal publikasi Sistem Informasi dan Telekomunikasi 2, no. 2 (2024): 69–84. http://dx.doi.org/10.62951/bridge.v2i2.62.

Abstract:
In the current era of technological advancement, deep learning has become a widely discussed and utilized topic, particularly in image classification, object detection, and natural language processing. A significant development in deep learning is the Convolutional Neural Network (CNN), which is enhanced with various optimizers such as Adam, RMSProp, and SGD. This thesis implements the Inception v3 architecture for the deep learning model, utilizing these three optimization methods to classify malaria disease. The study aims to evaluate performance and determine the best optimization method based on …
5

Ma, Jerry, and Denis Yarats. "On the Adequacy of Untuned Warmup for Adaptive Optimization." Proceedings of the AAAI Conference on Artificial Intelligence 35, no. 10 (2021): 8828–36. http://dx.doi.org/10.1609/aaai.v35i10.17069.

Abstract:
Adaptive optimization algorithms such as Adam (Kingma and Ba, 2014) are widely used in deep learning. The stability of such algorithms is often improved with a warmup schedule for the learning rate. Motivated by the difficulty of choosing and tuning warmup schedules, recent work proposes automatic variance rectification of Adam's adaptive learning rate, claiming that this rectified approach ("RAdam") surpasses the vanilla Adam algorithm and reduces the need for expensive tuning of Adam with warmup. In this work, we refute this analysis and provide an alternative explanation for the necessity of …
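The "untuned warmup" this paper studies is, to the best of my recollection, a simple linear ramp whose length depends only on β2; treat the constant below as an assumption and consult the paper for the exact rule.

```python
def untuned_linear_warmup(step, base_lr=1e-3, beta2=0.999):
    """Linear learning-rate warmup over roughly 2 / (1 - beta2) steps
    (2000 steps for beta2 = 0.999). A sketch of the rule-of-thumb style
    of schedule the paper analyzes, not a quotation of its formula."""
    warmup_steps = 2.0 / (1.0 - beta2)
    return base_lr * min(1.0, step / warmup_steps)
```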
6

Kamble, Arvind, and Virendra S. Malemath. "Adam Improved Rider Optimization-Based Deep Recurrent Neural Network for the Intrusion Detection in Cyber Physical Systems." International Journal of Swarm Intelligence Research 13, no. 3 (2022): 1–22. http://dx.doi.org/10.4018/ijsir.304402.

Abstract:
This paper designs intrusion detection systems for determining intrusions. Here, the Adam improved rider optimization approach (Adam IROA) is newly developed for detecting intrusions. Accordingly, the training of the DeepRNN is done by the proposed Adam IROA, which is designed by combining the Adam optimization algorithm with IROA. Thus, the newly developed Adam IROA is applied for intrusion detection. Overall, two phases are included in the proposed intrusion detection system: feature selection and classification. Here, the feature selection is done using …
7

Su, Stephanie S. W., and Sie Long Kek. "An Improvement of Stochastic Gradient Descent Approach for Mean-Variance Portfolio Optimization Problem." Journal of Mathematics 2021 (March 25, 2021): 1–10. http://dx.doi.org/10.1155/2021/8892636.

Abstract:
In this paper, a current variant of the stochastic gradient descent (SGD) approach, namely the adaptive moment estimation (Adam) approach, is improved by adding the standard error to the updating rule. The aim is to speed up the convergence of the Adam algorithm. This improvement is termed the Adam with standard error (AdamSE) algorithm. On the other hand, the mean-variance portfolio optimization model is formulated from the historical data of the rate of return of the S&P 500 stock, 10-year Treasury bond, and money market. The application of SGD, Adam, …
8

Irfan, Desi, Teddy Surya Gunawan, and Wanayumini Wanayumini. "Comparison Of SGD, Rmsprop, and Adam Optimation In Animal Classification Using CNNs." International Conference on Information Science and Technology Innovation (ICoSTEC) 2, no. 1 (2023): 45–51. http://dx.doi.org/10.35842/icostec.v2i1.35.

Abstract:
Many measures have been taken to protect endangered species by using "camera trap" technology, which is widespread in technology-based nature-protection field research. In this study, a machine learning-based approach is presented to identify endangered wildlife images with a dataset containing 5000 images taken from Kaggle and some other sources. The gradient descent optimization method is often used for artificial neural network (ANN) training. This method plays a role in finding the weight values that give the best output value. Three optimization methods have been implemented, …
9

Wang, Yijun, Pengyu Zhou, and Wenya Zhong. "An Optimization Strategy Based on Hybrid Algorithm of Adam and SGD." MATEC Web of Conferences 232 (2018): 03007. http://dx.doi.org/10.1051/matecconf/201823203007.

Abstract:
Despite superior training outcomes, adaptive optimization methods such as Adam, Adagrad or RMSprop have been found to generalize poorly compared to stochastic gradient descent (SGD). Scholars (Nitish Shirish Keskar et al., 2017) therefore proposed a hybrid strategy that starts training with Adam and switches to SGD at the right time. In learning tasks with a large output space, it was observed that Adam could not converge to an optimal solution (or could not converge to an extreme point in a non-convex setting) [1]. Therefore, this paper proposes a new variant of the Adam algorithm (AMSGrad), which not only …
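The AMSGrad modification mentioned in the abstract is small enough to sketch: it replaces Adam's second-moment estimate in the denominator with its running maximum, so the effective per-coordinate step size never grows. A minimal NumPy sketch (bias correction omitted, as in the original AMSGrad formulation), not this paper's hybrid Adam/SGD code:

```python
import numpy as np

def amsgrad_step(theta, grad, m, v, v_max, lr=1e-3, beta1=0.9, beta2=0.999, eps=1e-8):
    """One AMSGrad step: identical to Adam except the update divides by
    the running maximum of the second-moment estimate."""
    m = beta1 * m + (1 - beta1) * grad
    v = beta2 * v + (1 - beta2) * grad**2
    v_max = np.maximum(v_max, v)              # the AMSGrad change
    theta = theta - lr * m / (np.sqrt(v_max) + eps)
    return theta, m, v, v_max
```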
10

Zhang, Can, Yichuan Shao, Haijing Sun, Lei Xing, Qian Zhao, and Le Zhang. "The WuC-Adam algorithm based on joint improvement of Warmup and cosine annealing algorithms." Mathematical Biosciences and Engineering 21, no. 1 (2023): 1270–85. http://dx.doi.org/10.3934/mbe.2024054.

Abstract:
The Adam algorithm is a common choice for optimizing neural network models. However, its application often brings challenges, such as susceptibility to local optima, overfitting and convergence problems caused by unstable learning rate behavior. In this article, we introduce an enhanced Adam optimization algorithm that integrates warmup and cosine annealing techniques to alleviate these challenges. By integrating warmup into the traditional Adam algorithm, we systematically increased the learning rate during the initial training phase, effectively avoiding …
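Generic versions of the two schedules the abstract combines, linear warmup followed by cosine annealing, can be sketched as a single learning-rate function. This is the standard textbook formulation, not necessarily WuC-Adam's exact rule.

```python
import math

def warmup_cosine_lr(step, total_steps=1000, warmup_steps=100,
                     base_lr=1e-3, min_lr=0.0):
    """Linear warmup to base_lr, then cosine annealing down to min_lr."""
    if step < warmup_steps:
        return base_lr * step / warmup_steps
    progress = (step - warmup_steps) / max(1, total_steps - warmup_steps)
    return min_lr + 0.5 * (base_lr - min_lr) * (1 + math.cos(math.pi * progress))
```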

Dissertations / Theses on the topic "Adam optimization"

1

Bergdahl, Joakim. "Asynchronous Advantage Actor-Critic with Adam Optimization and a Layer Normalized Recurrent Network." Thesis, KTH, Optimeringslära och systemteori, 2017. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-220698.

Abstract:
State-of-the-art deep reinforcement learning models rely on asynchronous training using multiple learner agents and their collective updates to a central neural network. In this thesis, one of the most recent asynchronous policy gradient-based reinforcement learning methods, asynchronous advantage actor-critic (A3C), is examined and improved using prior research from the machine learning community. With application of the Adam optimization method and the addition of a long short-term memory (LSTM) with layer normalization, it is shown that the performance of A3C is increased.
2

Baumgärtner, Nils Julius [Verfasser], André [Akademischer Betreuer] Bardow, Hans-Jürgen [Akademischer Betreuer] Koß, and Adam [Akademischer Betreuer] Brandt. "Optimization of low-carbon energy systems from industrial to national scale / Nils Julius Baumgärtner ; André Bardow, Hans-Jürgen Koß, Adam Brandt." Aachen : Universitätsbibliothek der RWTH Aachen, 2020. http://d-nb.info/122994253X/34.

3

Engel, Anja Jeannine [Verfasser], Gerhard [Akademischer Betreuer] Thiel, and Adam [Akademischer Betreuer] Bertl. "Potassium channel-based optogenetic tool development - Establishment and optimization of compartment-specific light-inducible silencing tools / Anja Jeannine Engel ; Gerhard Thiel, Adam Bertl." Darmstadt : Universitäts- und Landesbibliothek Darmstadt, 2020. http://d-nb.info/1218231963/34.
5

Annergren, Mariette. "Application-Oriented Input Design and Optimization Methods Involving ADMM." Doctoral thesis, KTH, Reglerteknik, 2016. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-187890.

Abstract:
This thesis is divided into two main parts. The first part considers application-oriented input design, specifically for model predictive control (MPC). The second part considers the alternating direction method of multipliers (ADMM) for ℓ1-regularized optimization problems and primal-dual interior-point methods. The theory of system identification provides methods for estimating models of dynamical systems from experimental data. This thesis is focused on identifying models used for control, with special attention to MPC. The objective is to minimize the cost of the identification experiment while …
6

Li, Shijian. "Scalable User Assignment in Power Grids: A Data Driven Approach." Digital WPI, 2017. https://digitalcommons.wpi.edu/etd-theses/1107.

Abstract:
The fast pace of global urbanization is drastically changing population distributions over the world, which leads to significant changes in geographical population densities. Such changes in turn alter the underlying geographical power demand over time, and drive power substations to become over-supplied (demand ≪ capacity) or under-supplied (demand ≈ capacity). In this work, we make the first attempt to investigate the problem of power substation/user assignment by analyzing large-scale power grid data. We develop a Scalable Power User Assignment (SPUA) framework that takes large-scale …
7

Zeng, Shangzhi. "Algorithm-tailored error bound conditions and the linear convergence rate of ADMM." HKBU Institutional Repository, 2017. https://repository.hkbu.edu.hk/etd_oa/474.

Abstract:
In the literature, error bound conditions have been widely used for studying the linear convergence rates of various first-order algorithms, and the majority of the literature focuses on how to sufficiently ensure these error bound conditions, usually posing more assumptions on the model under discussion. In this thesis, we focus on the alternating direction method of multipliers (ADMM), and show that the known error bound conditions for studying ADMM's linear convergence can indeed be further weakened if the error bound is studied over the specific iterative sequence generated by ADMM. A so-called …
8

Ghadimi, Euhanna. "Accelerating Convergence of Large-scale Optimization Algorithms." Doctoral thesis, KTH, Reglerteknik, 2015. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-162377.

Abstract:
Several recent engineering applications in multi-agent systems, communication networks, and machine learning deal with decision problems that can be formulated as optimization problems. For many of these problems, new constraints limit the usefulness of traditional optimization algorithms. In some cases, the problem size is much larger than what can be conveniently dealt with using standard solvers. In other cases, the problems have to be solved in a distributed manner by several decision-makers with limited computational and communication resources. By exploiting problem structure, however, …
9

Annergren, Mariette. "ADMM for l1 Regularized Optimization Problems and Applications Oriented Input Design for MPC." Licentiate thesis, KTH, Reglerteknik, 2012. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-103618.

Abstract:
This licentiate thesis is divided into two main parts. The first part considers the alternating direction method of multipliers (ADMM) for ℓ1-regularized optimization problems, and the second part considers applications-oriented input design for model predictive control (MPC). Many important problems in science and engineering can be formulated as convex optimization problems. As such, they have a unique solution and there exist very efficient algorithms for finding the solution. We are interested in methods that can handle big (in terms of the number of variables) optimization problems in an efficient …
10

Vandi, Damiano. "ADAS Value Optimization for Rear Park Assist: Improvement and Assessment of Sensor Fusion Strategy." Master's thesis, Alma Mater Studiorum - Università di Bologna, 2021.

Abstract:
The project for this thesis consists of an ADAS Value Optimization activity conducted during an internship at Maserati S.p.A., with the objective of removing the ultrasonic sensors used for the Rear Park Assist (RPA) ADAS feature while obtaining the same functionality and performance in the detection and signaling of obstacles behind the car, through a new system based on a sensor fusion strategy between the Rear View Camera (RVC) and Blind Spot Radars (BSD). To achieve this goal, a study of the current RPA feature has been conducted and, starting from a previous implementation of the sensor fusion strategy, …

Book chapters on the topic "Adam optimization"

1

Bhandari, Mohan, Pramod Parajuli, Pralhad Chapagain, and Loveleen Gaur. "Evaluating Performance of Adam Optimization by Proposing Energy Index." In Communications in Computer and Information Science. Springer International Publishing, 2022. http://dx.doi.org/10.1007/978-3-031-07005-1_15.

2

He, Zhexin, and Huan Zhang. "A Method of Concrete Surface Crack Detection Using an Improved Convolutional Neural Network (CNN) Model." In Novel Technology and Whole-Process Management in Prefabricated Building. Springer Nature Singapore, 2024. http://dx.doi.org/10.1007/978-981-97-5108-2_36.

Abstract:
This essay spotlights concrete crack detection in infrastructure maintenance, highlighting its importance for structural integrity, cost-effectiveness, and eco-consciousness. It delves into various detection methods and introduces an improved VGG-16-based deep learning model with batch normalization, P-ReLU activation, and Adam optimization for better training outcomes. Through experiments on the MendeleyData-CrackDetection dataset, the enhanced model outperforms the original. This study underscores the significance of hyperparameter optimization and algorithm choice in deep learning.
3

Arora, Simrann, Akash Gupta, Rachna Jain, and Anand Nayyar. "Optimization of the CNN Model for Hand Sign Language Recognition Using Adam Optimization Technique." In Micro-Electronics and Telecommunication Engineering. Springer Singapore, 2021. http://dx.doi.org/10.1007/978-981-33-4687-1_10.

4

Kuremoto, Takashi, Masafumi Furuya, Shingo Mabu, and Kunikazu Kobayashi. "A Time Series Forecasting Method Using DBN and Adam Optimization." In Lecture Notes of the Institute for Computer Sciences, Social Informatics and Telecommunications Engineering. Springer Nature Switzerland, 2023. http://dx.doi.org/10.1007/978-3-031-29126-5_8.

5

Shah, Saurabh A., Dinesh G. Harkut, and Sayali M. Thakre. "DLSTM with Adam Waterwheel Optimization for Groundwater Level Prediction in India." In Lecture Notes in Networks and Systems. Springer Nature Singapore, 2024. https://doi.org/10.1007/978-981-97-6992-6_23.

6

Li, Yuting, Guojun Wang, Tao Peng, and Guanghui Feng. "FedTA: Locally-Differential Federated Learning with Top-k Mechanism and Adam Optimization." In Communications in Computer and Information Science. Springer Nature Singapore, 2023. http://dx.doi.org/10.1007/978-981-99-0272-9_26.

7

Narengbam, Lenin, and Shouvik Dey. "Artificial Neural Networks and Enhanced Adam Optimization for Effective Wi-Fi Intrusion Detection." In Lecture Notes in Electrical Engineering. Springer Nature Singapore, 2023. http://dx.doi.org/10.1007/978-981-99-4713-3_36.

8

Gautam, Vikas, and Banalaxmi Brahma. "Adam Lyrebird Optimization-Based DLSTM for Solar Irradiance Prediction Using Time Series Data." In Communications in Computer and Information Science. Springer Nature Switzerland, 2025. https://doi.org/10.1007/978-3-031-79041-6_2.

9

Agrahari, Omkar, Vandana Dixit Kaushik, and Vinay Kumar Pathak. "Adam Wild Horse Optimization with QRNN for Academic Performance Prediction in a Blended Learning Model." In Lecture Notes in Networks and Systems. Springer Nature Singapore, 2025. https://doi.org/10.1007/978-981-96-3250-3_30.

10

Lin, Zhouchen, Huan Li, and Cong Fang. "ADMM for Nonconvex Optimization." In Alternating Direction Method of Multipliers for Machine Learning. Springer Nature Singapore, 2022. http://dx.doi.org/10.1007/978-981-16-9840-8_4.


Conference papers on the topic "Adam optimization"

1

P, Subha, and Senthil Pandi S. "Advanced Fracture Analysis with Adam Optimization Techniques." In 2024 4th International Conference on Computer, Communication, Control & Information Technology (C3IT). IEEE, 2024. https://doi.org/10.1109/c3it60531.2024.10829465.

2

Zhang, Tao, Wanquan Xiong, Zhe Cheng, and Jiangpeng Zheng. "Optimization of 3D Object Detection Based on Adam." In 2024 International Conference on Image Processing, Computer Vision and Machine Learning (ICICML). IEEE, 2024. https://doi.org/10.1109/icicml63543.2024.10958129.

3

Nabila, Putri, and Erwin Budi Setiawan. "Adam and AdamW Optimization Algorithm Application on BERT Model for Hate Speech Detection on Twitter." In 2024 International Conference on Data Science and Its Applications (ICoDSA). IEEE, 2024. http://dx.doi.org/10.1109/icodsa62899.2024.10651619.

4

Das, Abhrodeep, Animesh Hazra, Souhardya Dutta, Indrajit Das, Sukal Roy, and Nibaran Das. "GWABHO-Adam: Grey Wolf Algorithm Based Hyperparameter Optimization of Adam to Enhance Cancer Classification from Microscopic Images." In 2024 IEEE 16th International Conference on Computational Intelligence and Communication Networks (CICN). IEEE, 2024. https://doi.org/10.1109/cicn63059.2024.10847452.

5

Zhao, Jianlong, Bangjia Hu, Jun Dong, and Haoyu Wang. "Threshold Shrinkage and Adam Alternating Optimization for Measurement Matrix." In 2024 6th International Conference on Natural Language Processing (ICNLP). IEEE, 2024. http://dx.doi.org/10.1109/icnlp60986.2024.10692959.

6

Huang, Zhixuan, and Xiaonan Luo. "Integrating the dynamic optimization strategies of Adam and AdaGrad." In 2025 13th International Conference on Intelligent Control and Information Processing (ICICIP). IEEE, 2025. https://doi.org/10.1109/icicip64458.2025.10898148.

7

Pan, Zhiyao, Zhongyi Chang, Ting Li, and Shaofu Yang. "Distributed Optimization via Mixing ADAM and Quasi-Newton Methods." In 2024 International Conference on Neuromorphic Computing (ICNC). IEEE, 2024. https://doi.org/10.1109/icnc64304.2024.10987794.

8

Agarwal, Muskan, Kanwarpartap Singh Gill, Sonal Malhotra, and Swati Devliyali. "Exploring Numerical Analysis with Sequential Convolutional Neural Networks Leveraging Adam Optimization Technique." In 2024 International Conference on Innovations and Challenges in Emerging Technologies (ICICET). IEEE, 2024. http://dx.doi.org/10.1109/icicet59348.2024.10616317.

9

Reddy, B. Bhaskar, Doaa Saadi Kareem, Ahmed Read Al-Tameemi, Palavalasa Tejesh, Rerupalli Naveen, and Gundeti SaiTeja. "Single Image Dehazing based on Multi-Scale Feature Fusion Under Adam-Optimization." In 2024 International Conference on Augmented Reality, Intelligent Systems, and Industrial Automation (ARIIA). IEEE, 2024. https://doi.org/10.1109/ariia63345.2024.11051934.

10

Gill, Kanwarpartap Singh, Avinash Sharma, Vatsala Anand, Rupesh Gupta, and Pratibha Deshmukh. "Expression of Concern for: Influence of Adam Optimizer with Sequential Convolutional Model for Detection of Tuberculosis." In 2022 International Conference on Computational Modelling, Simulation and Optimization (ICCMSO). IEEE, 2022. http://dx.doi.org/10.1109/iccmso58359.2022.10703649.


Reports on the topic "Adam optimization"

1

Jullig, Richard, and Y. V. Srinivas. Performance Optimization in ADA. Defense Technical Information Center, 1995. http://dx.doi.org/10.21236/ada305759.

2

ADA JOINT PROGRAM OFFICE ARLINGTON VA. Ada Compiler Validation Summary Report: Certificate Number: 910612W1. 11168 Telesoft, IBM Ada/370, Version 1.2.0 (without Optimization) IBM 3080, VM/SP HPO Rel 5.0 (Unopt) (Host & Target). Defense Technical Information Center, 1991. http://dx.doi.org/10.21236/ada246532.
