Federated Learning for Privacy-Preserving Distributed Model Training

Giulia Bianchi

Abstract

Federated Learning (FL) has emerged as a promising approach for training machine learning models across decentralized devices without centralizing data. This paper explores the principles, challenges, and advancements in FL, focusing particularly on its role in privacy-preserving distributed model training. We discuss the fundamental concepts of FL, its architecture, and various strategies employed to ensure data privacy while aggregating model updates from multiple edge devices. Key challenges such as communication efficiency, heterogeneous data distributions, and security concerns are addressed alongside state-of-the-art solutions and future research directions.
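The full text is not reproduced here; as a rough illustration of the aggregation step the abstract describes, the sketch below implements one round of federated averaging (FedAvg) in NumPy. The function names (local_sgd, fedavg_round), the linear-regression objective, and the toy client data are illustrative assumptions, not the paper's own method: the point is only that clients train locally on their own data and the server combines the returned weights, weighted by local sample counts, without ever seeing raw data.

import numpy as np

def local_sgd(weights, X, y, lr=0.1, epochs=1):
    # One client's local update: gradient descent on a squared loss.
    # The model and loss are placeholders chosen for brevity.
    w = weights.copy()
    for _ in range(epochs):
        grad = 2 * X.T @ (X @ w - y) / len(y)
        w -= lr * grad
    return w

def fedavg_round(global_weights, clients):
    # One federated round: each client trains locally; the server
    # averages the returned weights, weighted by each client's sample count.
    updates, sizes = [], []
    for X, y in clients:
        updates.append(local_sgd(global_weights, X, y))
        sizes.append(len(y))
    sizes = np.array(sizes, dtype=float)
    return np.average(np.stack(updates), axis=0, weights=sizes / sizes.sum())

# Toy usage: three clients with differently shifted (non-IID) feature distributions.
rng = np.random.default_rng(0)
true_w = np.array([2.0, -1.0])
clients = []
for shift in (0.0, 1.0, -1.0):
    X = rng.normal(shift, 1.0, size=(50, 2))
    y = X @ true_w + rng.normal(0, 0.1, size=50)
    clients.append((X, y))

w = np.zeros(2)
for _ in range(20):
    w = fedavg_round(w, clients)
print("estimated weights:", w)  # approaches true_w without sharing raw client data

In practice the local update would be mini-batch SGD on a neural network, and the weighted average would be combined with the privacy and communication-efficiency strategies surveyed in the paper (e.g., secure aggregation, differential privacy, update compression).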

Article Details

How to Cite
Bianchi, G. (2024). Federated Learning for Privacy-Preserving Distributed Model Training. Innovative Computer Sciences Journal, 10(1), 1–6. http://innovatesci-publishers.com/index.php/ICSJ/article/view/143
Section
Articles
