Journal articles on the topic 'Decentralized federated learning'

Consult the top 50 journal articles for your research on the topic 'Decentralized federated learning.'

Browse journal articles across a wide variety of disciplines and organise your bibliography correctly.

1

Jiang, Jingyan, Liang Hu, Chenghao Hu, Jiate Liu, and Zhi Wang. "BACombo—Bandwidth-Aware Decentralized Federated Learning." Electronics 9, no. 3 (2020): 440. http://dx.doi.org/10.3390/electronics9030440.

Abstract:
The emerging concern about data privacy and security has motivated the proposal of federated learning. Federated learning allows computing nodes to only synchronize the locally-trained models instead of their original data in distributed training. Conventional federated learning architecture, inherited from the parameter server design, relies on highly centralized topologies and large nodes-to-server bandwidths. However, in real-world federated learning scenarios, the network capacities between nodes are highly uniformly distributed and smaller than that in data centers. As a result, how to e
2

Srinivasa Rao, Angajala. "Unifying Intelligence: Federated Learning in Cloud Environments for Decentralized Machine Learning." International Journal of Science and Research (IJSR) 12, no. 12 (2023): 997–99. http://dx.doi.org/10.21275/sr231212134726.

3

Bonawitz, Kallista, Peter Kairouz, Brendan McMahan, and Daniel Ramage. "Federated learning and privacy." Communications of the ACM 65, no. 4 (2022): 90–97. http://dx.doi.org/10.1145/3500240.

4

Chouhan, Jitendra Singh, Amit Kumar Bhatt, and Nitin Anand. "Federated Learning: Privacy Preserving Machine Learning for Decentralized Data." Tuijin Jishu/Journal of Propulsion Technology 44, no. 1 (2023): 167–69. http://dx.doi.org/10.52783/tjjpt.v44.i1.2234.

Abstract:
Federated learning represents a compelling solution for tackling the privacy challenges inherent in decentralized and distributed environments when it comes to machine learning. This scholarly paper delves deep into the realm of federated learning, encompassing its applications and the latest privacy-preserving techniques used for training machine learning models in a decentralized manner. We explore the reasons behind the adoption of federated learning, highlight its advantages over conventional centralized approaches, and examine the diverse methods employed to sa
5

Woo, Gimoon, Hyungbin Kim, Seunghyun Park, Cheolwoo You, and Hyunhee Park. "Fairness-Based Multi-AP Coordination Using Federated Learning in Wi-Fi 7." Sensors 22, no. 24 (2022): 9776. http://dx.doi.org/10.3390/s22249776.

Abstract:
Federated learning is a type of distributed machine learning in which models learn by using large-scale decentralized data between servers and devices. In a short-range wireless communication environment, it can be difficult to apply federated learning because the number of devices in one access point (AP) is small, which can be small enough to perform federated learning. Therefore, it means that the minimum number of devices required to perform federated learning cannot be matched by the devices included in one AP environment. To do this, we propose to obtain a uniform global model regardless
6

Ahmed Al-Banori and Layla Hassan. "Federated Learning: Decentralized Machine Learning for Privacy Preservation." International Journal of Emerging Trends in Information Technology (IJEIT) 1, no. 1 (2025): 61–72. https://doi.org/10.64056/n2nbva22.

Abstract:
Federated Learning (FL) is an emerging paradigm in machine learning that enables multiple devices or organizations to collaboratively train a global model without sharing raw data. This decentralized approach addresses data silo and privacy challenges by ensuring sensitive information remains local to each participant. In this article, we survey recent advances in FL and propose a hypothetical experiment comparing federated and centralized learning on a standard dataset. Our experimental methodology employs the Federated Averaging algorithm across simulated clients and measures model accurac
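Several abstracts in this list, including the one above, reference the Federated Averaging (FedAvg) algorithm. As a rough, non-authoritative sketch of that idea (not the cited paper's implementation), the following Python/NumPy snippet runs a simulated FedAvg setup with placeholder linear-regression clients; all names, data, and hyperparameters are illustrative assumptions.

```python
# Minimal FedAvg round: each client takes local gradient steps on its own data,
# then the server averages the resulting weights, weighted by client sample counts.
# All names and data are illustrative placeholders, not from the cited papers.
import numpy as np

def local_update(weights, X, y, lr=0.1, epochs=5):
    """A few local SGD steps on a toy linear-regression task, standing in for real client training."""
    w = weights.copy()
    for _ in range(epochs):
        grad = 2 * X.T @ (X @ w - y) / len(y)   # mean-squared-error gradient
        w -= lr * grad
    return w

def fedavg_round(global_w, clients):
    """One communication round: broadcast, train locally, aggregate by sample count."""
    updates, sizes = [], []
    for X, y in clients:
        updates.append(local_update(global_w, X, y))
        sizes.append(len(y))
    sizes = np.array(sizes, dtype=float)
    return np.average(np.stack(updates), axis=0, weights=sizes / sizes.sum())

rng = np.random.default_rng(0)
true_w = np.array([2.0, -1.0])
clients = []
for _ in range(4):                               # four simulated clients with local data
    X = rng.normal(size=(50, 2))
    clients.append((X, X @ true_w + 0.1 * rng.normal(size=50)))

w = np.zeros(2)
for _ in range(20):                              # twenty federated rounds
    w = fedavg_round(w, clients)
print("estimated weights:", w)                   # should approach [2, -1]
```

The weighting by client sample count mirrors the standard FedAvg aggregation rule; real systems replace the toy local update with full model training on each device.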
7

Ajay, Ajay, Ajay Kumar, Krishan Kant Singh Gautam, Pratibha Deshmukh, Pavithra G, and Laith Abualigah. "Collaborative Intelligence for IoT: Decentralized Net security and confidentiality." Journal of Intelligent Systems and Internet of Things 13, no. 2 (2024): 202–11. http://dx.doi.org/10.54216/jisiot.130216.

Abstract:
This research compares federated and centralized learning paradigms to discover the best machine learning privacy-model accuracy balance. Federated learning allows model training across devices or clients without data centralization. It's innovative distributed machine learning. Keeping data on individual devices reduces the hazards of centralized data storage, improving user privacy and security. However, centralized learning concentrates data on a server, which raises privacy and security problems. It evaluates two learning approaches using simulated data in a simple regression problem frame
8

Monteiro, Daryn, Ishaan Mavinkurve, Parth Kambli, and Sakshi Surve. "Federated Learning for Privacy-Preserving Machine Learning: Decentralized Model Training with Enhanced Data Security." International Journal for Research in Applied Science and Engineering Technology 12, no. 11 (2024): 355–61. http://dx.doi.org/10.22214/ijraset.2024.65062.

Abstract:
Artificial Intelligence has found widespread use across various industries, from optimizing manufacturing workflows to diagnosing health conditions. However, the large volumes of data required to train AI models raise privacy concerns, especially when stored in centralized databases vulnerable to leaks. Federated Learning solves this problem by training models collaboratively by avoiding centralization of the sensitive data, preserving privacy while allowing decentralized models to be exported to edge devices. This paper explores Federated Learning, focusing on its technical aspects,
9

Jiang, Changsong, Chunxiang Xu, Chenchen Cao, and Kefei Chen. "GAIN: Decentralized Privacy-Preserving Federated Learning." Journal of Information Security and Applications 78 (November 2023): 103615. http://dx.doi.org/10.1016/j.jisa.2023.103615.

10

Qi, Lu, Haoze Chen, Hongliang Zou, Shaohua Chen, Xiaoying Zhang, and Hongyan Chen. "Decentralized Federated Learning with Prototype Exchange." Mathematics 13, no. 2 (2025): 237. https://doi.org/10.3390/math13020237.

Abstract:
As AI applications become increasingly integrated into daily life, protecting user privacy while enabling collaborative model training has become a crucial challenge, especially in decentralized edge computing environments. Traditional federated learning (FL) approaches, which rely on centralized model aggregation, struggle in such settings due to bandwidth limitations, data heterogeneity, and varying device capabilities among edge nodes. To address these issues, we propose PearFL, a decentralized FL framework that enhances collaboration and model generalization by introducing prototype exchan
11

Shi, Lingyu. "A survey on federated learning: evolution, applications and challenges." Applied and Computational Engineering 22, no. 1 (2023): 106–11. http://dx.doi.org/10.54254/2755-2721/22/20231177.

Abstract:
Federated learning, a machine learning technique that enables collaborative model training on decentralized data, has gained significant attention in recent years due to its potential to address privacy concerns. This paper explores the evolution, applications, and challenges of federated learning. The research topic focuses on providing a comprehensive understanding of federated learning, its advantages, and limitations. The purpose of the study is to highlight the importance of federated learning in preserving data privacy and enabling collaborative model training. The study conducted a lite
12

Machová, Kristína, Marián Mach, and Viliam Balara. "Federated Learning in the Detection of Fake News Using Deep Learning as a Basic Method." Sensors 24, no. 11 (2024): 3590. http://dx.doi.org/10.3390/s24113590.

Abstract:
This article explores the possibilities for federated learning with a deep learning method as a basic approach to train detection models for fake news recognition. Federated learning is the key issue in this research because this kind of learning makes machine learning more secure by training models on decentralized data at decentralized places, for example, at different IoT edges. The data are not transformed between decentralized places, which means that personally identifiable data are not shared. This could increase the security of data from sensors in intelligent houses and medical device
13

Myakala, Praveen Kumar, and Srikanth Kamatala. "Scalable Decentralized Multi-Agent Federated Reinforcement Learning: Challenges and Advances." International Journal of Electrical, Electronics and Computers 8, no. 6 (2023): 08–16. https://doi.org/10.22161/eec.86.2.

Abstract:
The increasing prevalence of decentralized multiagent systems has spurred interest in Federated Reinforcement Learning (FRL) as a privacy-preserving framework for collaborative learning. However, scaling FRL to multi-agent settings introduces significant challenges, particularly in communication efficiency, decentralized aggregation, and handling nonstationary environments. This survey explores recent advancements in Scalable Decentralized Multi-Agent Federated Reinforcement Learning (MA-FRL), with a focus on communication efficient strategies and decentralized aggregation techniques. We revie
14

Pasupuleti, Murali Krishna. "Federated Deep Learning for Secure and Decentralized Model Training." International Journal of Academic and Industrial Research Innovations(IJAIRI) 05, no. 05 (2025): 415–23. https://doi.org/10.62311/nesx/rphcr3.

Abstract:
Federated deep learning (FDL) is an emerging paradigm that enables multiple decentralized devices or institutions to collaboratively train a shared model while keeping data localized. This approach preserves privacy, reduces communication overhead, and complies with data governance regulations. In this paper, we explore the implementation and performance of FDL in real-world scenarios such as healthcare, finance, and IoT systems. Utilizing frameworks like TensorFlow Federated, PyTorch, and interpretability tools like SHAP and LIME, we evaluate FDL against centralized deep learning mo
15

Guerra, Elia, Francesc Wilhelmi, Marco Miozzo, and Paolo Dini. "The Cost of Training Machine Learning Models Over Distributed Data Sources." IEEE Open Journal of the Communications Society 4 (May 9, 2023): 1111–26. https://doi.org/10.1109/OJCOMS.2023.3274394.

Abstract:
Federated learning is one of the most appealing alternatives to the standard centralized learning paradigm, allowing a heterogeneous set of devices to train a machine learning model without sharing their raw data. However, it requires a central server to coordinate the learning process, thus introducing potential scalability and security issues. In the literature, server-less federated learning approaches like gossip federated learning and blockchain-enabled federated learning have been proposed to mitigate these issues. In this work, we propose a complete overview of these three techniques, p
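For context on the server-less alternatives surveyed above, here is a hedged sketch of gossip-style decentralized averaging over a ring of nodes; the topology, mixing weights, and random model vectors are assumptions for illustration, not the protocol evaluated in the cited work.

```python
# Gossip-style decentralized averaging over a ring: every node mixes its parameters
# with its neighbours each round; with a doubly-stochastic mixing matrix, all nodes
# converge to the average of the initial models. Purely illustrative.
import numpy as np

def ring_mixing_matrix(n):
    """Doubly-stochastic matrix: 1/2 self-weight, 1/4 to each ring neighbour."""
    W = np.zeros((n, n))
    for i in range(n):
        W[i, i] = 0.5
        W[i, (i - 1) % n] = 0.25
        W[i, (i + 1) % n] = 0.25
    return W

def gossip_round(models, W):
    """Each node's new model is a weighted average of its neighbourhood's models."""
    return W @ models

n_nodes, dim = 8, 3
rng = np.random.default_rng(1)
models = rng.normal(size=(n_nodes, dim))     # stand-ins for locally trained parameters
W = ring_mixing_matrix(n_nodes)

target = models.mean(axis=0)                 # consensus value the nodes should reach
for _ in range(50):
    models = gossip_round(models, W)

print("max deviation from consensus:", np.abs(models - target).max())
```

In practice, each node also performs local training between mixing steps; blockchain-enabled variants additionally record and validate the exchanged updates.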
16

Martínez Beltrán, Enrique Tomás, Ángel Luis Perales Gómez, Chao Feng, et al. "Fedstellar: A Platform for Decentralized Federated Learning." Expert Systems with Applications 242 (May 2024): 122861. http://dx.doi.org/10.1016/j.eswa.2023.122861.

17

Xu, Lei, Danya Xu, Xinlei Yi, Chao Deng, Tianyou Chai, and Tao Yang. "Decentralized Federated Learning Algorithm Under Adversary Eavesdropping." IEEE/CAA Journal of Automatica Sinica 12, no. 2 (2025): 448–56. https://doi.org/10.1109/jas.2024.125079.

18

Liu, Song, Xiong Wang, Longshuo Hui, and Weiguo Wu. "Blockchain-Based Decentralized Federated Learning Method in Edge Computing Environment." Applied Sciences 13, no. 3 (2023): 1677. http://dx.doi.org/10.3390/app13031677.

Abstract:
In recent years, federated learning has been able to provide an effective solution for data privacy protection, so it has been widely used in financial, medical, and other fields. However, traditional federated learning still suffers from single-point server failure, which is a frequent issue from the centralized server for global model aggregation. Additionally, it also lacks an incentive mechanism, which leads to the insufficient contribution of local devices to global model training. In this paper, we propose a blockchain-based decentralized federated learning method, named BD-FL, to solve
19

Al-Tameemi, M., M. B. Hassan, and S. A. Abass. "Federated Learning (FL) – Overview." LETI Transactions on Electrical Engineering & Computer Science 17, no. 5 (2024): 74–82. http://dx.doi.org/10.32603/2071-8985-2024-17-5-74-82.

Abstract:
Explores the fundamental aspects of federated learning (FL) in the context of intrusion detection systems (IDS) within Internet of Things (IoT) networks. Federated learning presents an innovative approach to training machine learning models on distributed devices, thereby minimizing the need to transmit sensitive data to central servers. We classify FL into horizontal, vertical, and federated transfer learning and examine their application in IDS systems. Additionally, we analyze the network structure of FL, encompassing centralized and decentralized FL. Based on the conducted review, it can b
20

Wang, Weixi. "Empowering safe and secure autonomy: Federated learning in the era of autonomous driving." Applied and Computational Engineering 51, no. 1 (2024): 40–44. http://dx.doi.org/10.54254/2755-2721/51/20241158.

Abstract:
Artificial Intelligence (AI) has a significant impact on empowering autonomous driving systems to perceive and interpret the environment effectively. However, ensuring data privacy and security in autonomous driving systems is a critical challenge. To surmount these hurdles, federated learning has emerged as an effective strategy. Federated learning is a decentralized machine learning approach that facilitates the cooperative training of models across a diverse set of connected devices, enabling them to collectively learn and improve their performance, while preserving data privacy. This appro
21

Alardawi, Ahmed S., Ammar Odeh, Abobakr Aboshgifa, and Nabil Belhaj. "Challenges and Opportunities in Federated Learning." International Science and Technology Journal 35 (October 1, 2022): 1–15. http://dx.doi.org/10.62341/aana0837.

Abstract:
Federated Learning (FL) is a machine learning framework that allows collaborative model training across multiple decentralized edge devices or servers while keeping the data local and private. This method eliminates the need to exchange sensitive data, enhancing privacy and security. FL leverages the distributed nature of data generation for more effective learning. The paper explores the foundational concepts of FL, focusing on how it enables collective learning without centralizing individual data. This is particularly important in sectors where data privacy is critical, such as healthcare a
22

Lee, Haeyun, Young Jun Chai, Hyunjin Joo, et al. "Federated Learning for Thyroid Ultrasound Image Analysis to Protect Personal Information: Validation Study in a Real Health Care Environment." JMIR Medical Informatics 9, no. 5 (2021): e25869. http://dx.doi.org/10.2196/25869.

Abstract:
Background: Federated learning is a decentralized approach to machine learning; it is a training strategy that overcomes medical data privacy regulations and generalizes deep learning algorithms. Federated learning mitigates many systemic privacy risks by sharing only the model and parameters for training, without the need to export existing medical data sets. In this study, we performed ultrasound image analysis using federated learning to predict whether thyroid nodules were benign or malignant. Objective: The goal of this study was to evaluate whether the performance of federated learning was
23

Pushkar, Mehendale. "Privacy-Preserving AI Through Federated Learning." Journal of Scientific and Engineering Research 8, no. 3 (2021): 249–54. https://doi.org/10.5281/zenodo.12787499.

Abstract:
Federated Learning (FL) is revolutionizing the landscape of decentralized machine learning by enabling collaborative model training across multiple devices without the need to centralize data. This paper provides a comprehensive exploration of federated learning as a privacy-preserving technique in artificial intelligence (AI), examining critical challenges such as data security, communication efficiency, and inference attacks. This paper focuses on robust solutions including differential privacy, homomorphic encryption, and federated optimization to enhance the effectiveness of FL. Potential
24

Shrivastava, Arpit. "Privacy-Centric AI: Navigating the Landscape with Federated Learning." International Journal for Research in Applied Science and Engineering Technology 12, no. 5 (2024): 357–63. http://dx.doi.org/10.22214/ijraset.2024.61000.

Abstract:
In the era of big data and privacy concerns, federated learning has emerged as a promising approach to training machine learning models while preserving data privacy. This paper explores the principles and applications of federated learning, highlighting its potential to revolutionize privacy-centric AI. We discuss the methodology, significance, and challenges of federated learning, providing insights into its future directions. By leveraging decentralized data and aggregating model updates, federated learning enables the development of powerful AI models without compromising individ
25

Kabat, Subash Ranjan. "Federated Learning for Privacy-Preserving AI: A Comparative Analysis of Decentralized Data Training." International Journal of Machine Learning, AI & Data Science Evolution 1, no. 01 (2025): 9–21. https://doi.org/10.63665/ijmlaidse.v1i1.02.

Abstract:
The rapid adoption of Artificial Intelligence (AI) across industries, particularly in healthcare, finance, and smart devices, has introduced significant concerns regarding data privacy, security, and compliance with regulations such as GDPR, HIPAA, and CCPA. Traditional centralized machine learning (ML) models require large-scale data aggregation, increasing risks of data breaches, misuse, and unauthorized access. Federated Learning (FL) has emerged as a transformative solution, allowing multiple edge devices or organizations to collaboratively train machine learning models without sharing raw
26

Zhou, Zhongchang, Fenggang Sun, Xiangyu Chen, Dongxu Zhang, Tianzhen Han, and Peng Lan. "A Decentralized Federated Learning Based on Node Selection and Knowledge Distillation." Mathematics 11, no. 14 (2023): 3162. http://dx.doi.org/10.3390/math11143162.

Abstract:
Federated learning has become increasingly important for modern machine learning, especially for data privacy sensitive scenarios. Existing federated learning mainly adopts a central server-based network topology, however, the training process of which is susceptible to the central node. To address this problem, this article proposed a decentralized federated learning method based on node selection and knowledge distillation. Specifically, the central node in this method is variable, and it is selected by the indicator interaction between nodes. Meanwhile, the knowledge distillation mechanism
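As background for the knowledge-distillation mechanism mentioned in this abstract, the snippet below shows a generic temperature-softened distillation loss in NumPy; the temperature and toy logits are assumed values, and this is a textbook formulation rather than the aggregation scheme proposed in the paper.

```python
# Generic knowledge-distillation loss: KL divergence between temperature-softened
# teacher and student distributions, scaled by T^2. Values are illustrative only.
import numpy as np

def softmax(z, T=1.0):
    z = z / T
    z = z - z.max(axis=-1, keepdims=True)   # numerical stability
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def distillation_loss(student_logits, teacher_logits, T=3.0):
    """Mean KL(teacher || student) over the batch on softened distributions."""
    p_t = softmax(teacher_logits, T)
    p_s = softmax(student_logits, T)
    kl = np.sum(p_t * (np.log(p_t + 1e-12) - np.log(p_s + 1e-12)), axis=-1)
    return float(T * T * kl.mean())

teacher = np.array([[4.0, 1.0, 0.5], [0.2, 3.5, 0.1]])   # hypothetical teacher-node logits
student = np.array([[2.0, 1.5, 0.5], [0.5, 2.0, 0.3]])   # hypothetical local-node logits
print("distillation loss:", distillation_loss(student, teacher))
```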
27

Parekh, Nisha Harish, and Vrushali Shinde. "Federated Learning: A Paradigm Shift in Collaborative Machine Learning." INTERNATIONAL JOURNAL OF SCIENTIFIC RESEARCH IN ENGINEERING AND MANAGEMENT 08, no. 11 (2024): 1–6. http://dx.doi.org/10.55041/ijsrem38501.

Abstract:
Federated learning (FL) has emerged as an exceptionally promising method within the realm of machine learning, enabling multiple entities to jointly train a global model while maintaining decentralized data. This paper presents a comprehensive review of federated learning methodologies, applications, and challenges. We begin by elucidating the fundamental concepts underlying FL, including federated optimization algorithms, communication protocols, and privacy-preserving techniques. Subsequently, we delve into various domains where FL has found significant traction, examples include healthcare,
28

Mishra, Snehlata, and Ritu Tandon. "Federated Learning in Healthcare: A Path Towards Decentralized and Secure Medical Insights." INTERNATIONAL JOURNAL OF SCIENTIFIC RESEARCH IN ENGINEERING AND MANAGEMENT 08, no. 10 (2024): 1–15. http://dx.doi.org/10.55041/ijsrem37791.

Abstract:
The rise of artificial intelligence (AI) in healthcare has created opportunities for advanced predictive models and personalized treatments, yet the sensitive nature of medical data presents significant challenges in terms of privacy, security, and regulatory compliance. Federated Learning (FL) has emerged as a promising solution to these issues, enabling decentralized machine learning across distributed datasets while preserving data privacy. This paper explores the application of FL in the healthcare domain, highlighting its potential to unlock valuable medical insights without the need for
29

Zhang, Ticao, and Shiwen Mao. "An Introduction to the Federated Learning Standard." GetMobile: Mobile Computing and Communications 25, no. 3 (2022): 18–22. http://dx.doi.org/10.1145/3511285.3511291.

Abstract:
With the growing concern on data privacy and security, it is undesirable to collect data from all users to perform machine learning tasks. Federated learning, a decentralized learning framework, was proposed to construct a shared prediction model while keeping owners' data on their own devices. This paper presents an introduction to the emerging federated learning standard and discusses its various aspects, including i) an overview of federated learning, ii) types of federated learning, iii) major concerns and the performance evaluation criteria of federated learning, and iv) associated regula
30

Sharma, Saurabh, Zohaib Hasan, and Vishal Paranjape. "Optimizing Federated Learning Techniques for Advanced Decentralized AI Systems." International Journal of Innovative Research in Computer and Communication Engineering 10, no. 08 (2023): 7721–29. http://dx.doi.org/10.15680/ijircce.2022.1008042.

Abstract:
Machine learning (ML) models have become indispensable in extracting insights and promoting innovation across diverse fields due to the emergence of large data. Historically, the creation of ML models has relied on gathering data in a centralized manner, which has raised notable concerns around privacy and security. This is because sensitive data needs to be transferred to a central place. Decentralized machine learning resolves these concerns by allowing the training of machine learning models on numerous devices without centralizing the data. This study explores the intricacies of decentrali
31

Sukender Reddy Mallreddy. "ENHANCING CLOUD DATA PRIVACY THROUGH FEDERATED LEARNING: A DECENTRALIZED APPROACH TO AI MODEL TRAINING." IJRDO -Journal of Computer Science Engineering 9, no. 8 (2023): 15–22. http://dx.doi.org/10.53555/cse.v9i8.6131.

Abstract:
The federated learning model on cloud platforms adjusts the training of the artificial intelligence models, shifting focus on data security while retaining the previously used formula. Traditional centralized approaches towards training AI models are insecure and unsafe for data and privacy because of the vulnerability of exposing data in a cloud setting. Federated learning helps to train the ML models with the assistance of numerous edge devices or servers without gaining access to data in a central server. The concept describing one of the promising ways to learn on big data without transmit
32

Niu, Yifan, and Weihong Deng. "Federated Learning for Face Recognition with Gradient Correction." Proceedings of the AAAI Conference on Artificial Intelligence 36, no. 2 (2022): 1999–2007. http://dx.doi.org/10.1609/aaai.v36i2.20095.

Abstract:
With increasing appealing to privacy issues in face recognition, federated learning has emerged as one of the most prevalent approaches to study the unconstrained face recognition problem with private decentralized data. However, conventional decentralized federated algorithm sharing whole parameters of networks among clients suffers from privacy leakage in face recognition scene. In this work, we introduce a framework, FedGC, to tackle federated learning for face recognition and guarantees higher privacy. We explore a novel idea of correcting gradients from the perspective of backward propaga
33

Lee, Jungjae, and Wooseong Kim. "DAG-Based Blockchain Sharding for Secure Federated Learning with Non-IID Data." Sensors 22, no. 21 (2022): 8263. http://dx.doi.org/10.3390/s22218263.

Abstract:
Federated learning is a type of privacy-preserving, collaborative machine learning. Instead of sharing raw data, the federated learning process cooperatively exchanges the model parameters and aggregates them in a decentralized manner through multiple users. In this study, we designed and implemented a hierarchical blockchain system using a public blockchain for a federated learning process without a trusted curator. This prevents model-poisoning attacks and provides secure updates of a global model. We conducted a comprehensive empirical study to characterize the performance of federated lear
34

Gawalkar, Shraddha. "Federated Learning for Privacy-Preserving AI in Edge and IoT Devices." International Scientific Journal of Engineering and Management 04, no. 05 (2025): 1–7. https://doi.org/10.55041/isjem03376.

Abstract:
This paper presents a study on the application of federated learning (FL) frameworks to enhance privacy-preserving artificial intelligence in edge and Internet of Things (IoT) environments. The core objective is to demonstrate how decentralized machine learning models can be trained collaboratively across distributed devices without sharing raw data, thus addressing major privacy and data security concerns. The research explores key architectural models of federated learning, evaluates optimization algorithms for efficient communication, and integrates secure aggregation techniques
35

Bharathi, M., T. Aditya Sai Srinivas, and M. Bhuvaneswari. "Federated Learning: From Origins to Modern Applications and Challenges." Journal of Information Technology and Cryptography (e-ISSN: 3048-5290) 1, no. 2 (2024): 29–38. http://dx.doi.org/10.48001/joitc.2024.1229-38.

Abstract:
Federated learning is an innovative machine learning approach that allows models to be trained collaboratively across decentralized data sources, all while keeping sensitive information where it belongs on local devices. This method has gained significant attention in recent years, primarily because it offers a way to address growing concerns around data privacy and security. Instead of collecting data in a central location, federated learning enables different entities, like hospitals or financial institutions, to work together on model training without ever sharing their raw data. This makes
36

Adeyinka Ogunbajo, Itunu Taiwo, Adefemi Quddus Abidola, Oluwadamilola Fisayo Adediran, and Israel Agbo-Adediran. "Privacy preserving AI models for decentralized data management in federated information systems." GSC Advanced Research and Reviews 22, no. 2 (2025): 104–12. https://doi.org/10.30574/gscarr.2025.22.2.0043.

Abstract:
Federated information systems represent a transformative approach to decentralized data management and privacy-preserving artificial intelligence. This review critically examines the architectural innovations, technological challenges, and emerging paradigms in federated learning and distributed computing environments. By enabling collaborative model training across disparate data sources without direct data sharing, these systems address critical privacy concerns while maintaining computational efficiency. The research synthesizes current implementation strategies across domains such as healt
37

Sato, Koya. "Decentralized Federated Learning over Wireless Channels: A Review." IEICE ESS Fundamentals Review 16, no. 1 (2022): 7–16. http://dx.doi.org/10.1587/essfr.16.1_7.

38

Qi, Minfeng, Ziyuan Wang, Shiping Chen, and Yang Xiang. "A Hybrid Incentive Mechanism for Decentralized Federated Learning." Distributed Ledger Technologies: Research and Practice 1, no. 1 (2022): 1–15. http://dx.doi.org/10.1145/3538226.

Abstract:
Federated Learning (FL) presents a privacy-compliant approach by sharing model parameters instead of raw data. However, how to motivate data owners to participate in and stay within an FL ecosystem by continuously contributing their data to the FL model remains a challenge. In this article, we propose a hybrid incentive mechanism based on blockchain to address the above challenge. The proposed mechanism comprises two primary smart contract-based modules, namely the reputation module and the reverse auction module. The former is used to dynamically calculate the reputation score of each FL part
39

Kuo, Tsung-Ting, and Anh Pham. "Detecting model misconducts in decentralized healthcare federated learning." International Journal of Medical Informatics 158 (February 2022): 104658. http://dx.doi.org/10.1016/j.ijmedinf.2021.104658.

40

Liu, Wei, Li Chen, and Wenyi Zhang. "Decentralized Federated Learning: Balancing Communication and Computing Costs." IEEE Transactions on Signal and Information Processing over Networks 8 (2022): 131–43. http://dx.doi.org/10.1109/tsipn.2022.3151242.

41

Ma, Xuyang, Du Xu, and Katinka Wolter. "Towards blockchain-enabled decentralized and secure federated learning." Information Sciences 665 (April 2024): 120368. http://dx.doi.org/10.1016/j.ins.2024.120368.

42

Balija, Sree Bhargavi, Amitash Nanda, and Debashis Sahoo. "Building Communication Efficient Asynchronous Peer-to-Peer Federated LLMs with Blockchain." Proceedings of the AAAI Symposium Series 3, no. 1 (2024): 288–92. http://dx.doi.org/10.1609/aaaiss.v3i1.31212.

Abstract:
Large language models (LLM) have gathered attention with the advent of ChatGPT. However, developing personalized LLM models faces challenges in real-world applications due to data scarcity and privacy concerns. Federated learning addresses these issues, providing collaborative training while preserving the client’s data. Although it has made significant progress, federated learning still faces ongoing challenges, such as communication efficiency, heterogeneous data, and privacy-preserving methods. This paper presents a novel, fully decentralized federated learning framework for LLMs to address
43

Vijayalaxmi Methuku. "Decentralized machine learning for disease outbreak prediction: Enhancing data privacy with federated learning." International Journal of Science and Research Archive 14, no. 3 (2025): 001–8. https://doi.org/10.30574/ijsra.2025.14.3.0590.

Abstract:
The ability to predict and contain disease outbreaks is essential for global public health. However, traditional machine learning models for epidemiological forecasting rely on centralized data aggregation, which poses significant privacy risks and regulatory challenges. In this study, we propose a federated learning (FL)-based decentralized framework that enables collaborative model training across multiple healthcare institutions without exposing sensitive patient data. By leveraging privacy-preserving techniques such as secure aggregation and differential privacy, our approach ensures data
44

Oise, Godfrey Perfectson, Chioma Julia Onwuzo, Mary Fole, et al. "DECENTRALIZED DEEP LEARNING IN HEALTHCARE: ADDRESSING DATA PRIVACY WITH FEDERATED LEARNING." FUDMA JOURNAL OF SCIENCES 9, no. 6 (2025): 19–26. https://doi.org/10.33003/fjs-2025-0906-3714.

Abstract:
This study presents a privacy-preserving federated learning framework combining recurrent neural networks for healthcare applications, balancing data privacy with clinical utility. The decentralized system enables multi-institutional collaboration without centralized data collection, complying with HIPAA/GDPR through two technical safeguards: differential privacy via DP-SGD during local training and secure aggregation of model updates. Using LSTM/GRU architectures optimized for sequential medical data, the framework achieves an F1 Score of 67% with precision (60%) and recall (75%) suitable for
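The DP-SGD safeguard cited in this abstract clips each per-example gradient and adds Gaussian noise before averaging. The sketch below illustrates that single step with assumed clipping norm, noise multiplier, and toy gradients; a production system would rely on a vetted differential-privacy library and a privacy accountant rather than this hand-rolled version.

```python
# Toy sketch of one DP-SGD update step: clip each per-example gradient to a fixed
# L2 norm, add Gaussian noise to the sum, then average. Values are illustrative.
import numpy as np

def dp_sgd_step(weights, per_example_grads, lr=0.05, clip_norm=1.0,
                noise_multiplier=1.1, rng=None):
    if rng is None:
        rng = np.random.default_rng()
    clipped = []
    for g in per_example_grads:
        norm = np.linalg.norm(g)
        clipped.append(g * min(1.0, clip_norm / (norm + 1e-12)))   # per-example clipping
    noisy_sum = np.sum(clipped, axis=0) + rng.normal(
        scale=noise_multiplier * clip_norm, size=weights.shape)    # Gaussian mechanism
    return weights - lr * noisy_sum / len(per_example_grads)

# Hypothetical batch of per-example gradients for a 4-parameter model.
rng = np.random.default_rng(0)
grads = rng.normal(size=(32, 4))
w = np.zeros(4)
w = dp_sgd_step(w, grads, rng=rng)
print("updated weights:", w)
```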
45

Kejriwal, Deepak Kumar, Anshul Goel, Tejakumar Dattatray Pujari, and Anil Kumar Pakina. "Adversarial AI in Federated Learning: Threats, Robust Defenses, and the Role of Explainability for Trustworthy Distributed AI." IOSR Journal of Computer Engineering 25, no. 6 (2023): 32–43. https://doi.org/10.9790/0661-2506033243.

Abstract:
Federated Learning (FL) is considered to be a suitable mechanism for privacy-preserving and distributed machine learning. While preserving decentralized data, it simultaneously protects global parameter updates. However, with all this advantage, FL reduces many security provisions, thus opening a gateway for adversary AI. Here the adversary can manipulate local model updates or poison decentralized training data, hijacking the system. Some challenges that come up include model poisoning, backdoor attacks, and gradient inversion, which would greatly compromise the reliability, privacy, and trustworthiness of
46

K. Usha Rani, Sreenivasulu Reddy L., Yaswanth Kumar Alapati, M. Katyayani, Kumar Keshamoni, and A. Sree Rama Chandra Murthy. "Federated Learning: Advancements, Applications, and Future Directions for Collaborative Machine Learning in Distributed Environments." Journal of Electrical Systems 20, no. 5s (2024): 165–71. http://dx.doi.org/10.52783/jes.1900.

Abstract:
Federated Learning (FL) has become widely recognized as a feasible method for training machine learning models on decentralized devices, ensuring the preservation of data privacy. This study offers an extensive overview of the latest progress in federated learning methods, their applications, and the challenges they entail. We begin by introducing the concept of federated learning and its significance in distributed environments. Next, we delve into a range of methodologies aimed at improving the effectiveness, scalability, and confidentiality of federated learning. These encompass optimizatio
47

Kundu, Subhasis. "Multi-Brain Federated Learning for Decentralized AI: Collaborative, Privacy-Preserving Models Across Domains." INTERNATIONAL JOURNAL OF SCIENTIFIC RESEARCH IN ENGINEERING AND MANAGEMENT 09, no. 06 (2025): 1–8. https://doi.org/10.55041/ijsrem19060.

Abstract:
Multi-brain Federated Learning (MBFL) introduces an innovative approach to decentralized artificial intelligence, enabling joint model training across various fields while maintaining data privacy. This study clarifies the MBFL concept and explores its potential uses in industries such as healthcare, finance, and defense. It covers the core principles of MBFL such as data decentralization, model aggregation, and privacy-preserving techniques. The benefits of MBFL, including improved model performance and reduction of data silos, are examined along with possible challenges and limitations. A fr
48

Gao, Yuan. "Federated learning: Impact of different algorithms and models on prediction results based on fashion-MNIST data set." Applied and Computational Engineering 86, no. 1 (2024): 210–18. http://dx.doi.org/10.54254/2755-2721/86/20241594.

Abstract:
The realm of federated learning is rapidly advancing amid the era of big data. Therefore, how to select a suitable federated learning algorithm to achieve realistic tasks has become particularly critical. In this study, we explore the impact of different algorithms and models on the prediction results of Federated Learning (FL) using the Fashion-MNIST data set. Federated Learning enhances data privacy and reduces latency by training models directly on local devices since it is a decentralized machine learning approach. We analyze the performance of several FL algorithms including Federated Ave
49

Yadav, Prajakta, Pratik Yalameli, Monish Parulekar, Akarshan Shukla, Chanchal Singhal, and Khushboo Singh. "Federated Learning for Edge Intelligence." INTERNATIONAL JOURNAL OF SCIENTIFIC RESEARCH IN ENGINEERING AND MANAGEMENT 09, no. 04 (2025): 1–9. https://doi.org/10.55041/ijsrem43629.

Abstract:
This research investigates the comparative performance of federated and centralized learning models for bird image classification across multiple training rounds. We implement a complete federated learning system using TensorFlow and Flower framework, with a MobileNetV2-based architecture capable of classifying five categories (bluetit, jackdaw, robin, unknown_bird, unknown_object). Our system demonstrates that federated learning achieves 92.3% accuracy compared to 94.1% in centralized mode, with the added benefit of data privacy preservation. The implementation includes a web-based interface
50

Li, Qinglun, Miao Zhang, Nan Yin, Quanjun Yin, Li Shen, and Xiaochun Cao. "Asymmetrically Decentralized Federated Learning." IEEE Transactions on Computers, 2025, 1–12. https://doi.org/10.1109/tc.2025.3569185.
