International Journal Papier Advance and Scientific Review, Volume 4, Issue 3 (Pages 001-007)
ISSN: 2709-0248
Copyright © 2023, International Journal Papier Advance and Scientific Review, under the license CC BY-SA 4.0
DOI: https://doi.org/10.47667/ijpasr.v4i3.235
Received: September 7, 2023
Received in Revised: October 25, 2023
Accepted: October 28, 2023
Secure Federated Learning with a Homomorphic Encryption Model

Yasmin Makki Mohialden¹, Nadia Mahmood Hussien¹, Saba Abdulbaqi Salman², Mohammad Aljanabi²

¹ Department of Computer Science, College of Science, Mustansiriyah University, Baghdad, Iraq
² Department of Computer Science, College of Education, Al-Iraqia University, Baghdad, Iraq
Abstract
Federated learning (FL) offers collaborative machine learning across decentralized devices
while safeguarding data privacy. However, data security and privacy remain key concerns. This
paper introduces "Secure Federated Learning with a Homomorphic Encryption Model,"
addressing these challenges by integrating homomorphic encryption into FL. The model starts
by initializing a global machine learning model and generating a homomorphic encryption key
pair, with the public key shared among FL participants. Using this public key, participants then
collect, preprocess, and encrypt their local data. During FL Training Rounds, participants
decrypt the global model, compute local updates on encrypted data, encrypt these updates, and
securely send them to the aggregator. The aggregator homomorphically combines updates
without revealing participant data, forwarding the encrypted aggregated update to the global
model owner. The Global Model Update ensures the owner decrypts the aggregated update
using the private key, updates the global model, encrypts it with the public key, and shares the
encrypted global model with FL participants. With optional model evaluation, training can
iterate for several rounds or until convergence. This model offers a robust solution to FL's
data privacy and security issues, with versatile applications across domains. This paper presents
core model components, advantages, and potential domain-specific implementations while
making significant strides in addressing FL's data privacy concerns.
Keywords: Secure Federated Learning; Homomorphic Encryption; Data Privacy; Model
Security; Collaborative Machine Learning
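The homomorphic aggregation step described in the abstract can be illustrated with a toy additively homomorphic, Paillier-style scheme: each participant encrypts its local update under the shared public key, the aggregator multiplies ciphertexts (which corresponds to adding plaintexts) without ever decrypting, and only the private-key holder recovers the summed update. This is a minimal sketch for intuition only; the function names and the small fixed primes are illustrative choices of ours, and a real FL deployment would use a vetted library with production-size keys.

```python
import math
import random

def keygen(p=104729, q=104723):
    # Toy Paillier keypair with fixed small primes (illustration only;
    # real deployments require moduli of 2048 bits or more).
    n = p * q
    lam = math.lcm(p - 1, q - 1)
    mu = pow(lam, -1, n)           # valid because we take g = n + 1
    return (n,), (n, lam, mu)      # (public key), (private key)

def encrypt(pub, m):
    # c = g^m * r^n mod n^2 with g = n + 1, so g^m = 1 + m*n mod n^2.
    (n,) = pub
    n2 = n * n
    r = random.randrange(1, n)
    while math.gcd(r, n) != 1:     # r must be invertible mod n
        r = random.randrange(1, n)
    return (1 + m * n) * pow(r, n, n2) % n2

def decrypt(priv, c):
    # m = L(c^lam mod n^2) * mu mod n, where L(x) = (x - 1) // n.
    n, lam, mu = priv
    n2 = n * n
    return ((pow(c, lam, n2) - 1) // n) * mu % n

def aggregate(pub, ciphertexts):
    # Homomorphic aggregation: the product of ciphertexts decrypts
    # to the sum of the underlying plaintext updates, so the
    # aggregator never sees any participant's data.
    (n,) = pub
    n2 = n * n
    out = 1
    for c in ciphertexts:
        out = out * c % n2
    return out
```

For example, three participants holding local updates 3, 5, and 9 each call `encrypt`, the aggregator calls `aggregate` on the ciphertexts, and the global model owner decrypts the result to 17 while individual updates stay hidden.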
Introduction
Federated Learning (FL) has transformed collaborative machine learning by enabling
decentralized model training while safeguarding data privacy (Baracaldo & Shaul, 2023;
NVIDIA, 2022; Jin, 2023; Wibawa et al., 2023; IEEE, 2021). This paradigm holds promise
across diverse domains such as healthcare, finance, and the Internet of Things (IoT).
Nonetheless, FL encounters formidable privacy and security challenges hindering its
widespread adoption. This paper introduces a pioneering approach, "Secure Federated
Learning with a Homomorphic Encryption Model," which confronts these challenges by
integrating homomorphic encryption techniques into the FL framework. Our approach
merges two robust technologies, federated learning and homomorphic encryption,
yielding notable contributions: