
Journal articles on the topic 'Frameworks of FL'



Consult the top 50 journal articles for your research on the topic 'Frameworks of FL.'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever these are available in the metadata.

Browse journal articles in a wide variety of disciplines and organise your bibliography correctly.

1

Byrnes, Heidi. "Of frameworks and the goals of collegiate foreign language education: critical reflections." Applied Linguistics Review 3, no. 1 (2012): 1–24. http://dx.doi.org/10.1515/applirev-2012-0001.

Full text
Abstract:
The paper suggests that among reasons for the difficulties collegiate foreign language (FL) programs in the United States (and most likely elsewhere) encounter in assuring that their students attain the kind of upper-level multiple literacies necessary for engaging in sophisticated work with FL oral and written texts may be the fact that prevailing frameworks for capturing FL performance, development, and assessment are insufficient for envisioning such textually oriented learning goals. The result of this mismatch between dominant frameworks, typically associated with communicative la
APA, Harvard, Vancouver, ISO, and other styles
2

I., Venkata Dwaraka Srihith. "Federated Frameworks: Pioneering Secure and Decentralized Authentication Systems." Journal of Advances in Computational Intelligence Theory 7, no. 1 (2024): 31–40. https://doi.org/10.5281/zenodo.13968684.

Full text
Abstract:
Federated Learning (FL) is an innovative machine learning approach that lets multiple devices work together to train models without sharing sensitive data. By keeping data on the device, FL not only boosts privacy and security but also helps improve models collectively. Recent research looked into how Blockchain technology could strengthen FL, tackling existing security issues. Blockchain adds a safeguard against threats like data tampering or unauthorized access and makes systems more transparent and fairer by improving how records and rewards are managed. By blending Blockchain with FL, we
APA, Harvard, Vancouver, ISO, and other styles
3

Rajendran, Suraj, Zhenxing Xu, Weishen Pan, Arnab Ghosh, and Fei Wang. "Data heterogeneity in federated learning with Electronic Health Records: Case studies of risk prediction for acute kidney injury and sepsis diseases in critical care." PLOS Digital Health 2, no. 3 (2023): e0000117. http://dx.doi.org/10.1371/journal.pdig.0000117.

Full text
Abstract:
With the wider availability of healthcare data such as Electronic Health Records (EHR), more and more data-driven based approaches have been proposed to improve the quality-of-care delivery. Predictive modeling, which aims at building computational models for predicting clinical risk, is a popular research topic in healthcare analytics. However, concerns about privacy of healthcare data may hinder the development of effective predictive models that are generalizable because this often requires rich diverse data from multiple clinical institutions. Recently, federated learning (FL) has demonstr
APA, Harvard, Vancouver, ISO, and other styles
4

Gufran, Danish, and Sudeep Pasricha. "FedHIL: Heterogeneity Resilient Federated Learning for Robust Indoor Localization with Mobile Devices." ACM Transactions on Embedded Computing Systems 22, no. 5s (2023): 1–24. http://dx.doi.org/10.1145/3607919.

Full text
Abstract:
Indoor localization plays a vital role in applications such as emergency response, warehouse management, and augmented reality experiences. By deploying machine learning (ML) based indoor localization frameworks on their mobile devices, users can localize themselves in a variety of indoor and subterranean environments. However, achieving accurate indoor localization can be challenging due to heterogeneity in the hardware and software stacks of mobile devices, which can result in inconsistent and inaccurate location estimates. Traditional ML models also heavily rely on initial training data, ma
APA, Harvard, Vancouver, ISO, and other styles
5

Mar’i, Farhanna, and Ahmad Afif Supianto. "A conceptual approach of optimization in federated learning." Indonesian Journal of Electrical Engineering and Computer Science 37, no. 1 (2025): 288. http://dx.doi.org/10.11591/ijeecs.v37.i1.pp288-299.

Full text
Abstract:
Federated learning (FL) is an emerging approach to distributed learning from decentralized data, designed with privacy concerns in mind. FL has been successfully applied in several fields, such as the internet of things (IoT), human activity recognition (HAR), and natural language processing (NLP), showing remarkable results. However, the development of FL in real-world applications still faces several challenges. Recent optimizations of FL have been made to address these issues and enhance the FL settings. In this paper, we categorize the optimization of FL into five main challenges: Communic
APA, Harvard, Vancouver, ISO, and other styles
6

Mar’i, Farhanna, and Ahmad Afif Supianto. "A conceptual approach of optimization in federated learning." Indonesian Journal of Electrical Engineering and Computer Science 37, no. 1 (2025): 288–99. https://doi.org/10.11591/ijeecs.v37.i1.pp288-299.

Full text
Abstract:
Federated learning (FL) is an emerging approach to distributed learning from decentralized data, designed with privacy concerns in mind. FL has been successfully applied in several fields, such as the internet of things (IoT), human activity recognition (HAR), and natural language processing (NLP), showing remarkable results. However, the development of FL in real-world applications still faces several challenges. Recent optimizations of FL have been made to address these issues and enhance the FL settings. In this paper, we categorize the optimization of FL into five main challenges: Communic
APA, Harvard, Vancouver, ISO, and other styles
7

Wang, Hanjing. "The practical applications of federated learning across various domains." Applied and Computational Engineering 87, no. 1 (2024): 154–61. http://dx.doi.org/10.54254/2755-2721/87/20241582.

Full text
Abstract:
With the advancement of artificial intelligence technology, a vast amount of data is transmitted during the model training process, significantly increasing the risk of data leakage. In an era where data privacy is highly valued, protecting data from leakage has become an urgent issue. Federated Learning (FL) has thus been proposed and applied across various fields. This paper presents the applications of FL in five key areas: healthcare, urban transportation, computer vision, Industrial Internet of Things (IIoT), and 5G networks. This paper discusses the feasibility of implementing FL for pri
APA, Harvard, Vancouver, ISO, and other styles
8

Luay Bahjat Albtosh. "Harnessing the power of federated learning to advance technology." World Journal of Advanced Research and Reviews 23, no. 3 (2024): 1302–12. http://dx.doi.org/10.30574/wjarr.2024.23.3.2768.

Full text
Abstract:
Federated Learning (FL) has emerged as a transformative paradigm in machine learning, advocating for decentralized, privacy-preserving model training. This study provides a comprehensive evaluation of contemporary FL frameworks – TensorFlow Federated (TFF), PySyft, and FedJAX – across three diverse datasets: CIFAR-10, IMDb reviews, and the UCI Heart Disease dataset. Our results demonstrate TFF's superior performance on image classification tasks, while PySyft excels in both efficiency and privacy for textual data. The study underscores the potential of FL in ensuring data privacy and model per
APA, Harvard, Vancouver, ISO, and other styles
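Whatever framework is chosen (TFF, PySyft, or FedJAX), a federated training round follows the same basic pattern this abstract alludes to: the server broadcasts the global model, each client trains on its private data, and the server aggregates the results. Below is a minimal, framework-agnostic sketch in plain Python/NumPy with a toy linear model and synthetic data; the helper names are hypothetical and it does not reflect the API of any of the three frameworks.

```python
# Framework-agnostic sketch of one federated round for a linear model.
# Hypothetical helper names; not the API of TFF, PySyft, or FedJAX.
import numpy as np

def local_update(w, X, y, lr=0.1, epochs=5):
    """Plain least-squares gradient descent on one client's private data."""
    w = w.copy()
    for _ in range(epochs):
        grad = 2 * X.T @ (X @ w - y) / len(y)
        w -= lr * grad
    return w

def federated_round(w_global, clients):
    """Broadcast, train locally, then aggregate weighted by client data size."""
    sizes = np.array([len(y) for _, y in clients], dtype=float)
    local_models = [local_update(w_global, X, y) for X, y in clients]
    weights = sizes / sizes.sum()
    return sum(wgt * w for wgt, w in zip(weights, local_models))

rng = np.random.default_rng(0)
true_w = np.array([2.0, -1.0])
clients = []
for n in (40, 60, 100):                      # three clients with different data sizes
    X = rng.normal(size=(n, 2))
    clients.append((X, X @ true_w + 0.1 * rng.normal(size=n)))

w = np.zeros(2)
for _ in range(20):
    w = federated_round(w, clients)
print("estimated weights:", w)               # approaches [2, -1] without pooling raw data
```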
9

Chia, Harmon Lee Bruce. "Harnessing the power of federated learning to advance technology." Advances in Engineering Innovation 2, no. 1 (2023): 44–47. http://dx.doi.org/10.54254/2977-3903/2/2023020.

Full text
Abstract:
Federated Learning (FL) has emerged as a transformative paradigm in machine learning, advocating for decentralized, privacy-preserving model training. This study provides a comprehensive evaluation of contemporary FL frameworks TensorFlow Federated (TFF), PySyft, and FedJAX across three diverse datasets: CIFAR-10, IMDb reviews, and the UCI Heart Disease dataset. Our results demonstrate TFF's superior performance on image classification tasks, while PySyft excels in both efficiency and privacy for textual data. The study underscores the potential of FL in ensuring data privacy and model perform
APA, Harvard, Vancouver, ISO, and other styles
10

Luay, Bahjat Albtosh. "Harnessing the power of federated learning to advance technology." World Journal of Advanced Research and Reviews 23, no. 3 (2024): 1303–12. https://doi.org/10.5281/zenodo.14945174.

Full text
Abstract:
Federated Learning (FL) has emerged as a transformative paradigm in machine learning, advocating for decentralized, privacy-preserving model training. This study provides a comprehensive evaluation of contemporary FL frameworks – TensorFlow Federated (TFF), PySyft, and FedJAX – across three diverse datasets: CIFAR-10, IMDb reviews, and the UCI Heart Disease dataset. Our results demonstrate TFF's superior performance on image classification tasks, while PySyft excels in both efficiency and privacy for textual data. The study underscores the potential of FL in ensuring data privacy a
APA, Harvard, Vancouver, ISO, and other styles
11

Ngoupayou Limbepe, Zounkaraneni, Keke Gai, and Jing Yu. "Blockchain-Based Privacy-Enhancing Federated Learning in Smart Healthcare: A Survey." Blockchains 3, no. 1 (2025): 1. https://doi.org/10.3390/blockchains3010001.

Full text
Abstract:
Federated learning (FL) has emerged as an efficient machine learning (ML) method with crucial privacy protection features. It is adapted for training models in Internet of Things (IoT)-related domains, including smart healthcare systems (SHSs), where the introduction of IoT devices and technologies can raise various security and privacy concerns. However, as FL cannot solely address all privacy challenges, privacy-enhancing technologies (PETs) and blockchain are often integrated to enhance privacy protection in FL frameworks within SHSs. The critical questions remain regarding how these techno
APA, Harvard, Vancouver, ISO, and other styles
12

Yan, Lei, Lei Wang, Guanjun Li, Jingwei Shao, and Zhixin Xia. "Secure Dynamic Scheduling for Federated Learning in Underwater Wireless IoT Networks." Journal of Marine Science and Engineering 12, no. 9 (2024): 1656. http://dx.doi.org/10.3390/jmse12091656.

Full text
Abstract:
Federated learning (FL) is a distributed machine learning approach that can enable Internet of Things (IoT) edge devices to collaboratively learn a machine learning model without explicitly sharing local data in order to achieve data clustering, prediction, and classification in networks. In previous works, some online multi-armed bandit (MAB)-based FL frameworks were proposed to enable dynamic client scheduling for improving the efficiency of FL in underwater wireless IoT networks. However, the security of online dynamic scheduling, which is especially essential for underwater wireless IoT, i
APA, Harvard, Vancouver, ISO, and other styles
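The multi-armed bandit (MAB) scheduling mentioned in this abstract treats each client as an arm and repeatedly selects the client with the best estimated utility plus an exploration bonus. A toy UCB1-style selection loop is sketched below; the Bernoulli reward is a made-up placeholder and this is not the paper's secure scheduling scheme.

```python
# Toy UCB1-style client scheduler for FL: pick the client whose observed
# "utility" (e.g., update quality or channel reliability) has the largest
# upper confidence bound. Illustrative only; not the paper's secure scheme.
import math, random

def ucb_schedule(num_clients=5, rounds=200, seed=1):
    random.seed(seed)
    true_utility = [random.uniform(0.2, 0.9) for _ in range(num_clients)]  # hidden from scheduler
    counts = [0] * num_clients
    means = [0.0] * num_clients
    for t in range(1, rounds + 1):
        if t <= num_clients:                      # play every arm once first
            k = t - 1
        else:
            k = max(range(num_clients),
                    key=lambda i: means[i] + math.sqrt(2 * math.log(t) / counts[i]))
        reward = 1.0 if random.random() < true_utility[k] else 0.0  # placeholder reward signal
        counts[k] += 1
        means[k] += (reward - means[k]) / counts[k]
    return counts, true_utility

counts, utility = ucb_schedule()
print("selection counts:", counts)
print("true utilities :", [round(u, 2) for u in utility])
```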
13

Kholod, Ivan, Evgeny Yanaki, Dmitry Fomichev, et al. "Open-Source Federated Learning Frameworks for IoT: A Comparative Review and Analysis." Sensors 21, no. 1 (2020): 167. http://dx.doi.org/10.3390/s21010167.

Full text
Abstract:
The rapid development of Internet of Things (IoT) systems has led to the problem of managing and analyzing the large volumes of data that they generate. Traditional approaches that involve collection of data from IoT devices into one centralized repository for further analysis are not always applicable due to the large amount of collected data, the use of communication channels with limited bandwidth, security and privacy requirements, etc. Federated learning (FL) is an emerging approach that allows one to analyze data directly on data sources and to federate the results of each analysis to yi
APA, Harvard, Vancouver, ISO, and other styles
14

Nguen, Thi Ha Mi. "Application of COSO and COBIT Frameworks for the Purpose of Organization of Internal Control." Auditor 7, no. 5 (2021): 15–23. http://dx.doi.org/10.12737/1998-0701-2021-7-5-15-23.

Full text
Abstract:
This article analyzes the relevance of these frameworks for the purposes of organizing internal control and proposes a scheme for integrating the COBIT and COSO frameworks: the theses of these two frameworks are interconnected through the concepts of «process» and «information flows». The article also considers the application of the integrated concept in the context of the implementation of the principles of sustainable development in the activities of the organization.
APA, Harvard, Vancouver, ISO, and other styles
15

Novikova, Evgenia, Dmitry Fomichov, Ivan Kholod, and Evgeny Filippov. "Analysis of Privacy-Enhancing Technologies in Open-Source Federated Learning Frameworks for Driver Activity Recognition." Sensors 22, no. 8 (2022): 2983. http://dx.doi.org/10.3390/s22082983.

Full text
Abstract:
Wearable devices and smartphones that are used to monitor the activity and the state of the driver collect a lot of sensitive data such as audio, video, location and even health data. The analysis and processing of such data require observing the strict legal requirements for personal data security and privacy. The federated learning (FL) computation paradigm has been proposed as a privacy-preserving computational model that allows securing the privacy of the data owner. However, it still has no formal proof of privacy guarantees, and recent research showed that the attacks targeted both the m
APA, Harvard, Vancouver, ISO, and other styles
16

Matschinske, Julian, Julian Späth, Mohammad Bakhtiari, et al. "The FeatureCloud Platform for Federated Learning in Biomedicine: Unified Approach." Journal of Medical Internet Research 25 (July 12, 2023): e42621. http://dx.doi.org/10.2196/42621.

Full text
Abstract:
Background Machine learning and artificial intelligence have shown promising results in many areas and are driven by the increasing amount of available data. However, these data are often distributed across different institutions and cannot be easily shared owing to strict privacy regulations. Federated learning (FL) allows the training of distributed machine learning models without sharing sensitive data. In addition, the implementation is time-consuming and requires advanced programming skills and complex technical infrastructures. Objective Various tools and frameworks have been developed t
APA, Harvard, Vancouver, ISO, and other styles
17

Dalimarta, Fahmy Ferdian, Nina Faoziyah, and Doni Setiawan. "A Novel Privacy-Preserving Algorithm for Secure Data Sharing in Federated Learning Frameworks." Journal of Computer Networks, Architecture and High Performance Computing 7, no. 1 (2025): 223–34. https://doi.org/10.47709/cnahpc.v7i1.5385.

Full text
Abstract:
Federated Learning (FL) has emerged as a promising paradigm for the collaborative training of machine learning models across decentralized devices while preserving data privacy. However, ensuring data security and privacy during model updates remains a critical challenge, particularly in scenarios that involve sensitive data. This study proposes a novel Privacy-Preserving Algorithm (PPA-FL) designed to enhance data security and mitigate privacy leakage risks in FL frameworks. The algorithm integrates advanced encryption techniques, such as homomorphic encryption, with differential privacy to s
APA, Harvard, Vancouver, ISO, and other styles
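Real homomorphic encryption requires a cryptographic library, so the sketch below substitutes a much simpler mechanism to convey the aggregation idea in this abstract: clients clip and noise their updates (a differential-privacy flavour) and add pairwise masks that cancel in the sum, so the server only ever sees the protected aggregate. This is an illustrative stand-in, not the PPA-FL algorithm itself.

```python
# Simplified stand-in for privacy-preserving aggregation: pairwise additive
# masks cancel in the sum (so the server never sees a raw update), and
# clipped, Gaussian-noised updates give a differential-privacy flavour.
# Illustrative only; NOT the homomorphic-encryption-based PPA-FL algorithm.
import numpy as np

rng = np.random.default_rng(42)
dim, n_clients = 4, 3
updates = [rng.normal(size=dim) for _ in range(n_clients)]    # pretend model updates

def clip_and_noise(u, clip=1.0, sigma=0.1):
    u = u * min(1.0, clip / (np.linalg.norm(u) + 1e-12))       # L2 clipping
    return u + rng.normal(scale=sigma, size=u.shape)           # Gaussian noise

# Pairwise masks: client i adds +m_ij for j > i and subtracts m_ji for j < i.
masks = {(i, j): rng.normal(size=dim)
         for i in range(n_clients) for j in range(i + 1, n_clients)}

masked = []
for i in range(n_clients):
    u = clip_and_noise(updates[i])
    for j in range(n_clients):
        if i < j:
            u = u + masks[(i, j)]
        elif j < i:
            u = u - masks[(j, i)]
    masked.append(u)                       # what the server actually receives

aggregate = sum(masked) / n_clients        # masks cancel in the sum
print("aggregate of protected updates:", np.round(aggregate, 3))
```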
18

R., Sanjana, Nikesh M., Bhuvaneshwari M., Bharathi M., and Aditya Sai Srinivas T. "United Intelligence: Federated Learning for the Future of Technology." Research and Applications of Web Development and Design 8, no. 1 (2024): 1–7. https://doi.org/10.5281/zenodo.13933081.

Full text
Abstract:
Federated Learning (FL) is rapidly transforming how we approach machine learning by offering a decentralized, privacy-first way to train models. Instead of sending data to a central server, FL enables devices to collaborate and learn without ever sharing sensitive information, making it a game-changer for privacy-conscious applications. In this study, we dive deep into three leading FL frameworks—TensorFlow Federated (TFF), PySyft, and FedJAX—testing them on datasets like CIFAR-10 for image classification, IMDb reviews for sentiment analysis, and the UCI Heart Disease dataset f
APA, Harvard, Vancouver, ISO, and other styles
19

Wu, Lang, Weijian Ruan, Jinhui Hu, and Yaobin He. "A Survey on Blockchain-Based Federated Learning." Future Internet 15, no. 12 (2023): 400. http://dx.doi.org/10.3390/fi15120400.

Full text
Abstract:
Federated learning (FL) and blockchains exhibit significant commonality, complementarity, and alignment in various aspects, such as application domains, architectural features, and privacy protection mechanisms. In recent years, there have been notable advancements in combining these two technologies, particularly in data privacy protection, data sharing incentives, and computational performance. Although there are some surveys on blockchain-based federated learning (BFL), these surveys predominantly focus on the BFL framework and its classifications, yet lack in-depth analyses of the pivotal
APA, Harvard, Vancouver, ISO, and other styles
20

Du, Weidong, Min Li, Yiliang Han, Xu An Wang, and Zhaoying Wei. "A Homomorphic Signcryption-Based Privacy Preserving Federated Learning Framework for IoTs." Security and Communication Networks 2022 (September 22, 2022): 1–10. http://dx.doi.org/10.1155/2022/8380239.

Full text
Abstract:
Federated learning (FL) enables clients to train a machine learning model collaboratively by just aggregating their model parameters, which makes it very useful in empowering the IoTs with intelligence. To prevent privacy information leakage from parameters during aggregation, many FL frameworks use homomorphic encryption to protect client’s parameters. However, a secure federated learning framework should not only protect privacy of the parameters but also guarantee integrity of the aggregated results. In this paper, we propose an efficient homomorphic signcryption framework that can encrypt
APA, Harvard, Vancouver, ISO, and other styles
21

Aishwarya, M., J. Umesh Chandra, M. Farhan Ali, M. Bharathi, and T. Aditya Sai Srinivas. "Secure and Scalable AI: Insights into Federated Learning Algorithms and Platforms." Journal of Communication Engineering and VLSI Design 2, no. 2 (2024): 15–27. http://dx.doi.org/10.48001/jocevd.2024.2215-27.

Full text
Abstract:
This paper takes a closer look at the rapidly advancing field of Federated Learning (FL), a decentralized machine learning approach that focuses on preserving data privacy by training models across multiple devices. It highlights key algorithms like Federated Averaging (FedAvg) and its more refined versions, Hierarchical Federated Averaging (HierFAVG) and Federated Matched Averaging (FedMA), which improve model aggregation techniques. The discussion extends to both Horizontal and Vertical Federated Learning (HFL and VFL), illustrating how they handle data partitioning and communication different
APA, Harvard, Vancouver, ISO, and other styles
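Federated Averaging, the baseline algorithm named in this abstract, aggregates client models by averaging their parameters weighted by local sample counts. A minimal sketch of that aggregation step follows; the dict-of-arrays parameter layout is an assumption, and HierFAVG and FedMA refine this step (hierarchical averaging, neuron matching) rather than replace it.

```python
# FedAvg aggregation step: average per-layer parameters weighted by each
# client's number of local samples. Sketch only; HierFAVG adds edge-level
# hierarchical averaging and FedMA matches neurons before averaging.
import numpy as np

def fedavg(client_params, client_sizes):
    """client_params: list of dicts {layer_name: ndarray}; client_sizes: samples per client."""
    total = float(sum(client_sizes))
    agg = {name: np.zeros_like(p) for name, p in client_params[0].items()}
    for params, n in zip(client_params, client_sizes):
        for name, p in params.items():
            agg[name] += (n / total) * p
    return agg

clients = [
    {"w": np.array([[1.0, 2.0]]), "b": np.array([0.5])},
    {"w": np.array([[3.0, 0.0]]), "b": np.array([-0.5])},
]
print(fedavg(clients, client_sizes=[100, 300]))
# -> "w" = 0.25*[[1, 2]] + 0.75*[[3, 0]] = [[2.5, 0.5]], "b" = [-0.25]
```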
22

Li, Youpeng, Xuyu Wang, and Lingling An. "Hierarchical Clustering-based Personalized Federated Learning for Robust and Fair Human Activity Recognition." Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies 7, no. 1 (2022): 1–38. http://dx.doi.org/10.1145/3580795.

Full text
Abstract:
Currently, federated learning (FL) can enable users to collaboratively train a global model while protecting the privacy of user data, which has been applied to human activity recognition (HAR) tasks. However, in real HAR scenarios, deploying an FL system needs to consider multiple aspects, including system accuracy, fairness, robustness, and scalability. Most existing FL frameworks aim to solve specific problems while ignoring other properties. In this paper, we propose FedCHAR, a personalized FL framework with a hierarchical clustering method for robust and fair HAR, which not only improves
APA, Harvard, Vancouver, ISO, and other styles
23

Jerkovic, Filip, Nurul I. Sarkar, and Jahan Ali. "Smart Grid IoT Framework for Predicting Energy Consumption Using Federated Learning Homomorphic Encryption." Sensors 25, no. 12 (2025): 3700. https://doi.org/10.3390/s25123700.

Full text
Abstract:
Homomorphic Encryption (HE) introduces new dimensions of security and privacy within federated learning (FL) and internet of things (IoT) frameworks that allow preservation of user privacy when handling data for FL occurring in Smart Grid (SG) technologies. In this paper, we propose a novel SG IoT framework to provide a solution for predicting energy consumption while preserving user privacy in a smart grid system. The proposed framework is based on the integration of FL, edge computing, and HE principles to provide a robust and secure framework to conduct machine learning workloads end-to-end
APA, Harvard, Vancouver, ISO, and other styles
24

Elshair, Ismail M., Tariq Jamil Saifullah Khanzada, Muhammad Farrukh Shahid, and Shahbaz Siddiqui. "Evaluating Federated Learning Simulators: A Comparative Analysis of Horizontal and Vertical Approaches." Sensors 24, no. 16 (2024): 5149. http://dx.doi.org/10.3390/s24165149.

Full text
Abstract:
Federated learning (FL) is a decentralized machine learning approach whereby each device is allowed to train local models, eliminating the requirement for centralized data collection and ensuring data privacy. Unlike typical centralized machine learning, collaborative model training in FL involves aggregating updates from various devices without sending raw data. This ensures data privacy and security while enabling collective learning from distributed data sources. These devices in FL models exhibit high efficacy in terms of privacy protection, scalability, and robustness, which i
APA, Harvard, Vancouver, ISO, and other styles
25

Pujari, Mangesh, and Anil Kumar Pakina. "EdgeAI for Privacy-Preserving AI: The Role of Small LLMs in Federated Learning Environments." International Journal of Engineering and Computer Science 13, no. 10 (2024): 26589–601. https://doi.org/10.18535/ijecs.v13i10.4889.

Full text
Abstract:
Privacy considerations in artificial intelligence (AI) have led to the popularization of federated learning (FL) as a decentralized training organization. On this basis, FL allows collaborative model training without requiring data exchange for private data use. The adoption of FL on edge devices faces major challenges due to limited computational resources, networks, and energy efficiency. This paper analyzes the operation of small language models (SLMs) in FL frameworks with an eye on their promise to let intelligent privacy-preserving architectures thrive on edge devices. It is through SLMs
APA, Harvard, Vancouver, ISO, and other styles
26

Li, Boyuan, Shengbo Chen, and Zihao Peng. "New Generation Federated Learning." Sensors 22, no. 21 (2022): 8475. http://dx.doi.org/10.3390/s22218475.

Full text
Abstract:
With the development of the Internet of things (IoT), federated learning (FL) has received increasing attention as a distributed machine learning (ML) framework that does not require data exchange. However, current FL frameworks follow an idealized setup in which the task size is fixed and the storage space is unlimited, which is impossible in the real world. In fact, new classes of these participating clients always emerge over time, and some samples are overwritten or discarded due to storage limitations. We urgently need a new framework to adapt to the dynamic task sequences and strict stor
APA, Harvard, Vancouver, ISO, and other styles
27

Dey, Subarna, Asamanjoy Bhunia, Dolores Esquivel, and Christoph Janiak. "Covalent triazine-based frameworks (CTFs) from triptycene and fluorene motifs for CO2 adsorption." Journal of Materials Chemistry A 4, no. 17 (2016): 6259–63. http://dx.doi.org/10.1039/c6ta00638h.

Full text
Abstract:
Two microporous CTFs with triptycene (TPC) and fluorene (FL) have been synthesized through a mild AlCl₃-catalyzed Friedel–Crafts reaction, with the highest surface area of up to 1668 m² g⁻¹ for non-ionothermal CTFs. CTF-TPC and CTF-FL show an excellent carbon dioxide uptake capacity of up to 4.24 mmol g⁻¹ at 273 K and 1 bar.
APA, Harvard, Vancouver, ISO, and other styles
28

Jia, Yongzhe, Xuyun Zhang, Amin Beheshti, and Wanchun Dou. "FedLPS: Heterogeneous Federated Learning for Multiple Tasks with Local Parameter Sharing." Proceedings of the AAAI Conference on Artificial Intelligence 38, no. 11 (2024): 12848–56. http://dx.doi.org/10.1609/aaai.v38i11.29181.

Full text
Abstract:
Federated Learning (FL) has emerged as a promising solution in Edge Computing (EC) environments to process the proliferation of data generated by edge devices. By collaboratively optimizing the global machine learning models on distributed edge devices, FL circumvents the need for transmitting raw data and enhances user privacy. Despite practical successes, FL still confronts significant challenges including constrained edge device resources, multiple tasks deployment, and data heterogeneity. However, existing studies focus on mitigating the FL training costs of each single task whereas neglec
APA, Harvard, Vancouver, ISO, and other styles
29

Han, Bing, Qiang Fu, and Xinliang Zhang. "Towards Privacy-Preserving Federated Neuromorphic Learning via Spiking Neuron Models." Electronics 12, no. 18 (2023): 3984. http://dx.doi.org/10.3390/electronics12183984.

Full text
Abstract:
Federated learning (FL) has been broadly adopted in both academia and industry in recent years. As a bridge to connect the so-called “data islands”, FL has contributed greatly to promoting data utilization. In particular, FL enables disjoint entities to cooperatively train a shared model, while protecting each participant’s data privacy. However, current FL frameworks cannot offer privacy protection and reduce the computation overhead at the same time. Therefore, its implementation in practical scenarios, such as edge computing, is limited. In this paper, we propose a novel FL framework with s
APA, Harvard, Vancouver, ISO, and other styles
30

Sabuhi, Mikael, Petr Musilek, and Cor-Paul Bezemer. "Micro-FL: A Fault-Tolerant Scalable Microservice-Based Platform for Federated Learning." Future Internet 16, no. 3 (2024): 70. http://dx.doi.org/10.3390/fi16030070.

Full text
Abstract:
As the number of machine learning applications increases, growing concerns about data privacy expose the limitations of traditional cloud-based machine learning methods that rely on centralized data collection and processing. Federated learning emerges as a promising alternative, offering a novel approach to training machine learning models that safeguards data privacy. Federated learning facilitates collaborative model training across various entities. In this approach, each user trains models locally and shares only the local model parameters with a central server, which then generates a glo
APA, Harvard, Vancouver, ISO, and other styles
31

Zhang, Yanting, Jianwei Liu, Zhenyu Guan, Bihe Zhao, Xianglun Leng, and Song Bian. "ARMOR: Differential Model Distribution for Adversarially Robust Federated Learning." Electronics 12, no. 4 (2023): 842. http://dx.doi.org/10.3390/electronics12040842.

Full text
Abstract:
In this work, we formalize the concept of differential model robustness (DMR), a new property for ensuring model security in federated learning (FL) systems. For most conventional FL frameworks, all clients receive the same global model. If there exists a Byzantine client who maliciously generates adversarial samples against the global model, the attack will be immediately transferred to all other benign clients. To address the attack transferability concern and improve the DMR of FL systems, we propose the notion of differential model distribution (DMD) where the server distributes different
APA, Harvard, Vancouver, ISO, and other styles
32

Wu, Xueyang, Hengguan Huang, Youlong Ding, Hao Wang, Ye Wang, and Qian Xu. "FedNP: Towards Non-IID Federated Learning via Federated Neural Propagation." Proceedings of the AAAI Conference on Artificial Intelligence 37, no. 9 (2023): 10399–407. http://dx.doi.org/10.1609/aaai.v37i9.26237.

Full text
Abstract:
Traditional federated learning (FL) algorithms, such as FedAvg, fail to handle non-i.i.d data because they learn a global model by simply averaging biased local models that are trained on non-i.i.d local data, therefore failing to model the global data distribution. In this paper, we present a novel Bayesian FL algorithm that successfully handles such a non-i.i.d FL setting by enhancing the local training task with an auxiliary task that explicitly estimates the global data distribution. One key challenge in estimating the global data distribution is that the data are partitioned in FL, and th
APA, Harvard, Vancouver, ISO, and other styles
33

Chauhan, Ankit. "Decentralized AI Model Training and Inference Using Blockchain for Privacy-Preserving Federated Learning." Journal of Research and Innovation in Technology, Commerce and Management 2, no. 6 (2025): 2654–61. https://doi.org/10.5281/zenodo.15606610.

Full text
Abstract:
Federated Learning (FL) has emerged as a promising approach to training machine learning models across distributed devices while preserving data privacy by avoiding centralized data collection. However, traditional FL frameworks rely on a central server to aggregate model updates, introducing vulnerabilities such as single-point failures, lack of transparency, and susceptibility to adversarial attacks like model poisoning. To address these challenges, this paper proposes a decentralized AI training and inference framework that integrates blockchain technology with FL to enhance security, priva
APA, Harvard, Vancouver, ISO, and other styles
34

Elmorsy, Esraa S., Ayman Mahrous, Wael A. Amer, and Mohamad M. Ayad. "Nitrogen-Doped Carbon Dots in Zeolitic Imidazolate Framework Core-Shell Nanocrystals: Synthesis and Characterization." Solid State Phenomena 336 (August 30, 2022): 81–87. http://dx.doi.org/10.4028/p-206xsy.

Full text
Abstract:
Metal-organic frameworks (MOFs) have exciting properties and promising applications in different fields. In this work, novel zeolitic imidazolate frameworks (ZIFs) have been synthesized by encapsulating N-doped carbon quantum dots (N-CDs) with a blue FL into the zeolitic imidazolate framework materials core-shell structure (ZIF-8@ZIF-67). The functionalized core-shell MOFs maintained their crystal structure, morphology, and enhanced UV-vis absorbance. The properties of these new composites exhibit excellent potential for different applications including sensing, photo-catalysis, and selective
APA, Harvard, Vancouver, ISO, and other styles
35

Shaheen, Momina, Muhammad Shoaib Farooq, Tariq Umer, and Byung-Seo Kim. "Applications of Federated Learning; Taxonomy, Challenges, and Research Trends." Electronics 11, no. 4 (2022): 670. http://dx.doi.org/10.3390/electronics11040670.

Full text
Abstract:
The federated learning technique (FL) supports the collaborative training of machine learning and deep learning models for edge network optimization. However, a complex edge network with heterogeneous devices operating under different constraints can affect FL performance, which poses a problem in this area. Therefore, research has turned to designing new frameworks and approaches to improve federated learning processes. The purpose of this study is to provide an overview of the FL technique and its applicability in different domains. The key focus of the paper is to produce a systematic literatu
APA, Harvard, Vancouver, ISO, and other styles
36

Lin, Zu-Jin, Jian Lü, Maochun Hong, and Rong Cao. "Metal–organic frameworks based on flexible ligands (FL-MOFs): structures and applications." Chem. Soc. Rev. 43, no. 16 (2014): 5867–95. http://dx.doi.org/10.1039/c3cs60483g.

Full text
APA, Harvard, Vancouver, ISO, and other styles
37

Liu, Zelei, Yuanyuan Chen, Yansong Zhao, et al. "Contribution-Aware Federated Learning for Smart Healthcare." Proceedings of the AAAI Conference on Artificial Intelligence 36, no. 11 (2022): 12396–404. http://dx.doi.org/10.1609/aaai.v36i11.21505.

Full text
Abstract:
Artificial intelligence (AI) is a promising technology to transform the healthcare industry. Due to the highly sensitive nature of patient data, federated learning (FL) is often leveraged to build models for smart healthcare applications. Existing deployed FL frameworks cannot address the key issues of varying data quality and heterogeneous data distributions across multiple institutions in this sector. In this paper, we report our experience developing and deploying the Contribution-Aware Federated Learning (CAFL) framework for smart healthcare. It provides an efficient and accurate approach
APA, Harvard, Vancouver, ISO, and other styles
38

Alhafiz, Fatimah, and Abdullah Basuhail. "The Data Heterogeneity Issue Regarding COVID-19 Lung Imaging in Federated Learning: An Experimental Study." Big Data and Cognitive Computing 9, no. 1 (2025): 11. https://doi.org/10.3390/bdcc9010011.

Full text
Abstract:
Federated learning (FL) has emerged as a transformative framework for collaborative learning, offering robust model training across institutions while ensuring data privacy. In the context of making a COVID-19 diagnosis using lung imaging, FL enables institutions to collaboratively train a global model without sharing sensitive patient data. A central manager aggregates local model updates to compute global updates, ensuring secure and effective integration. The global model’s generalization capability is evaluated using centralized testing data before dissemination to participating nodes, whe
APA, Harvard, Vancouver, ISO, and other styles
39

Guo, Jinnan, Peter Pietzuch, Andrew Paverd, and Kapil Vaswani. "Trustworthy AI using Confidential Federated Learning." Queue 22, no. 2 (2024): 87–107. http://dx.doi.org/10.1145/3665220.

Full text
Abstract:
The principles of security, privacy, accountability, transparency, and fairness are the cornerstones of modern AI regulations. Classic FL was designed with a strong emphasis on security and privacy, at the cost of transparency and accountability. CFL addresses this gap with a careful combination of FL with TEEs and commitments. In addition, CFL brings other desirable security properties, such as code-based access control, model confidentiality, and protection of models during inference. Recent advances in confidential computing such as confidential containers and confidential GPUs mean that ex
APA, Harvard, Vancouver, ISO, and other styles
40

Méndez Prado, Silvia Mariela, Marlon José Zambrano Franco, Susana Gabriela Zambrano Zapata, Katherine Malena Chiluiza García, Patricia Everaert, and Martin Valcke. "A Systematic Review of Financial Literacy Research in Latin America and The Caribbean." Sustainability 14, no. 7 (2022): 3814. http://dx.doi.org/10.3390/su14073814.

Full text
Abstract:
Several well-known studies have remarked on the low financial literacy (FL) levels in Latin America and the Caribbean (LAC), which represent a problem in an economic context of change and uncertainty. This fact gives us the opportunity to evaluate the current state of literature related to FL in the region. The main list of identified keywords allowed the PRISMA methodology to guide the systematic literature review and analysis procedure. During 2016–2022, the FL search yielded around 4500 FL manuscripts worldwide, but only 65 articles were related to the scope of our analysis (which involved
APA, Harvard, Vancouver, ISO, and other styles
41

Usman, Muhammad, Mario Luca Bernardi, and Marta Cimitile. "Introducing a Quality-Driven Approach for Federated Learning." Sensors 25, no. 10 (2025): 3083. https://doi.org/10.3390/s25103083.

Full text
Abstract:
The advancement of pervasive systems has made distributed real-world data across multiple devices increasingly valuable for training machine learning models. Traditional centralized learning approaches face limitations such as data security concerns and computational constraints. Federated learning (FL) provides privacy benefits but is hindered by challenges like data heterogeneity (Non-IID distributions) and noise heterogeneity (mislabeling and inconsistencies in local datasets), which degrade model performance. This paper proposes a model-agnostic, quality-driven approach, called DQFed, for
APA, Harvard, Vancouver, ISO, and other styles
42

Naseh, David, Mahdi Abdollahpour, and Daniele Tarchi. "Real-World Implementation and Performance Analysis of Distributed Learning Frameworks for 6G IoT Applications." Information 15, no. 4 (2024): 190. http://dx.doi.org/10.3390/info15040190.

Full text
Abstract:
This paper explores the practical implementation and performance analysis of distributed learning (DL) frameworks on various client platforms, responding to the dynamic landscape of 6G technology and the pressing need for a fully connected distributed intelligence network for Internet of Things (IoT) devices. The heterogeneous nature of clients and data presents challenges for effective federated learning (FL) techniques, prompting our exploration of federated transfer learning (FTL) on Raspberry Pi, Odroid, and virtual machine platforms. Our study provides a detailed examination of the design
APA, Harvard, Vancouver, ISO, and other styles
43

Olagunju, Funminiyi. "Federated Learning in the Era of Data Privacy: An Exhaustive Survey of Privacy Preserving Techniques, Legal Frameworks, and Ethical Considerations." International Journal of Future Engineering Innovations 2, no. 3 (2025): 153–60. https://doi.org/10.54660/ijfei.2025.2.3.153-160.

Full text
Abstract:
Federated Learning (FL) has emerged as a transformative approach to decentralized machine learning, enabling model training across multiple devices without centralizing sensitive data. While FL inherently supports privacy, growing concerns around data security, regulatory compliance, and ethical accountability have led to the development of advanced privacy preserving mechanisms. This systematic review, conducted in adherence with PRISMA guidelines, explores the landscape of privacy enhancing techniques, legal regulations, and ethical implications associated with Federated Learning. We sourced
APA, Harvard, Vancouver, ISO, and other styles
44

Zhang, Yu, Xiaowei Peng, and Hequn Xian. "pFedBASC: Personalized Federated Learning with Blockchain-Assisted Semi-Centralized Framework." Future Internet 16, no. 5 (2024): 164. http://dx.doi.org/10.3390/fi16050164.

Full text
Abstract:
As network technology advances, there is an increasing need for a trusted new-generation information management system. Blockchain technology provides a decentralized, transparent, and tamper-proof foundation. Meanwhile, data islands have become a significant obstacle for machine learning applications. Although federated learning (FL) ensures data privacy protection, server-side security concerns persist. Traditional methods have employed a blockchain system in FL frameworks to maintain a tamper-proof global model database. In this context, we propose a novel personalized federated learning (p
APA, Harvard, Vancouver, ISO, and other styles
45

Balaji, Soundararajan. "Designing Federated Learning Systems for Collaborative Financial Analytics." International Journal of Leading Research Publication 5, no. 2 (2024): 1–12. https://doi.org/10.5281/zenodo.15051139.

Full text
Abstract:
Federated Learning (FL) has emerged as a transformative paradigm for privacy-preserving collaborative machine learning, particularly in the financial sector, where data privacy and regulatory compliance are paramount. By enabling decentralized model training across distributed datasets without centralized data aggregation, FL addresses critical challenges in financial analytics, such as fraud detection, risk assessment, credit scoring, and cross-institutional insights. We will explore the principles, applications, and challenges of FL in finance, emphasizing its potential to enhance model robu
APA, Harvard, Vancouver, ISO, and other styles
46

Nagaraj Naik, Vikranth B M. "Federated Learning: A Comprehensive Survey on Types, Applications, Challenges, and Future Directions." Communications on Applied Nonlinear Analysis 32, no. 9s (2025): 2089–99. https://doi.org/10.52783/cana.v32.4450.

Full text
Abstract:
Federated Learning (FL) is a new machine learning (ML) paradigm where multiple parties jointly train a model without sharing each other’s raw data, which is beneficial for predictive analytics with privacy protection and low data transmission cost. This work surveys the basic concepts in FL such as the three categories of FL: horizontal FL, vertical FL, and federated transfer learning, which are suitable for different types of data partitioning. With that in mind, FL is fitting for applications in healthcare, finance, edge computing, and IoT devices as it is able to maintain privacy-aware AI d
APA, Harvard, Vancouver, ISO, and other styles
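The three FL categories in this abstract differ mainly in how the data is partitioned: horizontal FL gives clients the same features for different samples, while vertical FL gives them different features for the same samples. A tiny illustration on made-up data:

```python
# Horizontal vs. vertical FL partitioning, illustrated on a toy table.
# Rows = samples, columns = features. Hypothetical example data.
import numpy as np

data = np.arange(24).reshape(6, 4)        # 6 samples, 4 features

# Horizontal FL: same feature space, disjoint samples (e.g., two regional hospitals).
hfl_client_a, hfl_client_b = data[:3, :], data[3:, :]

# Vertical FL: same samples, disjoint features (e.g., a bank and an insurer
# holding different attributes about the same customers).
vfl_client_a, vfl_client_b = data[:, :2], data[:, 2:]

print("HFL shapes:", hfl_client_a.shape, hfl_client_b.shape)   # (3, 4) (3, 4)
print("VFL shapes:", vfl_client_a.shape, vfl_client_b.shape)   # (6, 2) (6, 2)
```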
47

Moshawrab, Mohammad, Mehdi Adda, Abdenour Bouzouane, Hussein Ibrahim, and Ali Raad. "A Maneuver in the Trade-Off Space of Federated Learning Aggregation Frameworks Secured with Polymorphic Encryption: PolyFLAM and PolyFLAP Frameworks." Electronics 13, no. 18 (2024): 3716. http://dx.doi.org/10.3390/electronics13183716.

Full text
Abstract:
Maintaining user privacy in machine learning is a critical concern due to the implications of data collection. Federated learning (FL) has emerged as a promising solution by sharing trained models rather than user data. However, FL still faces several challenges, particularly in terms of security and privacy, such as vulnerability to inference attacks. There is an inherent trade-off between communication traffic across the network and computational costs on the server or client, which this paper aims to address by maneuvering between these performance parameters. To tackle these issues, this p
APA, Harvard, Vancouver, ISO, and other styles
48

Praveen, Kumar Myakala, Naayini Prudhvi, and Kamatala Srikanth. "A Survey on Federated Learning for TinyML: Challenges, Techniques, and Future Directions." Partners Universal International Innovation Journal (PUIIJ) 03, no. 02 (2025): 97–114. https://doi.org/10.5281/zenodo.15240508.

Full text
Abstract:
The convergence of Federated Learning (FL) and Tiny Machine Learning (TinyML) represents a transformative step toward enabling intelligent and privacy-preserving applications on resource-constrained edge devices. TinyML focuses on deploying lightweight machine learning models on microcontrollers and other low-power devices, whereas FL facilitates decentralized learning across distributed datasets without compromising user privacy. This survey provides a comprehensive review of the current state of research at the intersection of FL and TinyML, exploring model optimization techniques such as qu
APA, Harvard, Vancouver, ISO, and other styles
49

Qi, Tao, Huili Wang, and Yongfeng Huang. "Towards the Robustness of Differentially Private Federated Learning." Proceedings of the AAAI Conference on Artificial Intelligence 38, no. 18 (2024): 19911–19. http://dx.doi.org/10.1609/aaai.v38i18.29967.

Full text
Abstract:
Robustness and privacy protection are two important factors of trustworthy federated learning (FL). Existing FL works usually secure data privacy by perturbing local model gradients via the differential privacy (DP) technique, or defend against poisoning attacks by filtering out local gradients that fall in the outlier region of the gradient distribution before aggregation. However, these two issues are often addressed independently in existing works, and how to secure federated learning in both privacy and robustness still needs further exploration. In this paper, we unveil that although DP noisy perturbation
APA, Harvard, Vancouver, ISO, and other styles
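The two standard mechanisms this abstract refers to can be sketched generically: per-client gradient clipping with Gaussian noise (the DP side) and server-side filtering of outlier gradients before averaging (the robustness side). The snippet below is a simplified illustration, not the paper's specific method.

```python
# Two standard building blocks, sketched generically (not the paper's method):
# (1) DP-style protection: clip each local gradient and add Gaussian noise;
# (2) robustness: drop gradients whose norm is an outlier, then average.
import numpy as np

rng = np.random.default_rng(7)

def dp_protect(grad, clip=1.0, sigma=0.5):
    """Client side: clip the gradient's L2 norm, then add Gaussian noise."""
    grad = grad * min(1.0, clip / (np.linalg.norm(grad) + 1e-12))
    return grad + rng.normal(scale=sigma * clip, size=grad.shape)

def filter_and_average(grads, z=2.0):
    """Server side: drop gradients whose norm is a z-score outlier, then average."""
    norms = np.array([np.linalg.norm(g) for g in grads])
    keep = np.abs(norms - norms.mean()) <= z * norms.std() + 1e-12
    kept = [g for g, k in zip(grads, keep) if k]
    return np.mean(kept, axis=0), int(keep.sum())

honest = [rng.normal(scale=0.1, size=5) for _ in range(9)]
poisoned = [np.full(5, 10.0)]                     # one crude poisoning attempt
uploads = honest + poisoned

dp_uploads = [dp_protect(g) for g in honest]      # privacy mechanism (client side)
agg, kept = filter_and_average(uploads, z=2.0)    # robustness mechanism (server side)
print("example DP-protected upload:", np.round(dp_uploads[0], 3))
print(f"kept {kept}/{len(uploads)} gradients after outlier filtering")
```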
50

Zamboni, Camilla. "Language, Play, Storytelling: Tabletop Role-Playing Games in the Italian L2 Classroom." Italica 101, no. 1 (2024): 149–70. https://doi.org/10.5406/23256672.101.1.09.

Full text
Abstract:
Abstract Within the larger context of gamification and game-based learning, in this article I argue that particularly tabletop role-playing games (TTRPGs), which are centered around storytelling and communication, are effective tools that second and foreign language (L2/FL) instructors can use in their pedagogical planning as both frameworks for thinking about language learning and as practical affordances to leverage in class interaction. I will define what TTRPGs are and how they can be beneficial for language learning; I will discuss ways to implement TTRPGs in our curricula in various form
APA, Harvard, Vancouver, ISO, and other styles