Privacy concerns are paramount in today's data-driven healthcare industry. Federated Learning (FL) reduces the risk of data breaches by enabling collaborative model training without exchanging raw patient data. Differential Privacy (DP) strengthens FL's decentralized approach by injecting noise into model updates, further protecting patient information. This combination is particularly useful for applications such as early cardiovascular disease detection, where accurate models must be built without compromising privacy. In this setting, hospitals train models locally and share only model updates with a central server, which aggregates them into a refined global model. Remaining challenges include achieving model convergence and managing communication overhead; ongoing research aims to optimize both, ensuring secure, privacy-preserving healthcare solutions.
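The workflow described above, in which each hospital computes a local update, noise is added for differential privacy, and a central server aggregates the results, can be sketched as follows. This is a minimal illustration, not the paper's implementation: the function name, the clipping threshold, and the noise scale are illustrative assumptions, and the "hospital updates" are synthetic vectors.

```python
import numpy as np

def dp_federated_round(local_updates, clip_norm=1.0, noise_std=0.1, rng=None):
    """One illustrative round of federated averaging with DP:
    each client's update is clipped to a maximum L2 norm (bounding
    any single patient cohort's influence), Gaussian noise is added,
    and the server averages the noisy updates into a global update."""
    rng = rng or np.random.default_rng(0)
    noisy = []
    for u in local_updates:
        u = np.asarray(u, dtype=float)
        norm = np.linalg.norm(u)
        if norm > clip_norm:
            u = u * (clip_norm / norm)          # clip to bound sensitivity
        u = u + rng.normal(0.0, noise_std, u.shape)  # Gaussian DP noise
        noisy.append(u)
    return np.mean(noisy, axis=0)               # server-side aggregation

# Three hypothetical hospitals each send a local model update.
updates = [np.array([0.5, -0.2]), np.array([3.0, 4.0]), np.array([0.1, 0.1])]
global_update = dp_federated_round(updates, clip_norm=1.0, noise_std=0.05)
```

In practice the noise scale is chosen from a target privacy budget (epsilon, delta) rather than fixed by hand, and the server applies the aggregated update to the shared global model each round.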
R. Anusuya (1), D. Karthika Renuka (2), Ashok Kumar (3), S.K. Prithika (4), S. Mridula (5), T. Subhaashini (6), R. Tharsha (7)
(1, 2, 4, 5, 6, 7) PSG College of Technology, India; (3) Thiagarajar College of Engineering, India
Keywords: Federated Learning (FL), Differential Privacy (DP), Data-driven Healthcare, Privacy-preserving Solutions, Early Cardiovascular Disease Detection, Model Convergence, Communication Overhead
Published by ICTACT in the ICTACT Journal on Soft Computing, Volume 15, Issue 3, Pages 3646-3652, January 2025.