Svetlana Boudko

Norwegian Computing Center, Oslo, Norway
Where federated learning meets homomorphic encryption: challenges and potential pathways for secure data sharing in AI applications
Thu 01 Aug | 14:45 - 16:15 | SR05


Dr. Svetlana Boudko is a Senior Research Scientist at the Norwegian Computing Center in Oslo, Norway. She defended her PhD in computer science at the University of Oslo in 2014 and has over 20 years of experience working on R&D projects. She has served as a program committee member and reviewer for several scientific journals and conferences, including MDPI Sensors, MDPI Information, MDPI Electronics, MoMM (International Conference on Advances in Mobile Computing & Multimedia), and GameSec (Conference on Decision and Game Theory for Security). She also served as a peer reviewer for the book "Artificial Intelligence for Security: Enhancing Protection in a Changing World" and chairs the SecHealth workshop. Her areas of interest include cybersecurity, privacy and data protection, secure multi-party computation, and federated learning.

Where federated learning meets homomorphic encryption: challenges and potential pathways for secure data sharing in AI applications

To ensure data consistency and control, data centralization is the preferred approach to training machine learning models. However, data protection regulations such as the GDPR, as well as industrial competition, restrict information sharing among organizations and individuals. The approach is also technically challenging, since the cost of collecting, storing, and processing all data in one central location is often prohibitively high.

Google proposed federated learning for the collaborative training of machine learning models, aiming to handle the exchange of privacy-sensitive information in distributed environments and to reduce data transmission costs. In contrast to traditional machine learning, federated learning does not require local data to be collected, stored, and processed on a central server. Instead, models are trained on-device using client-specific data, and only the resulting local model updates are aggregated on a central server.

Federated learning is not without its own privacy concerns, however, including risks of data leakage and inference attacks. To address these challenges, various strategies are being investigated, among them homomorphic encryption. By combining federated learning with homomorphic encryption, we can train machine learning models on encrypted data from different sources, ensuring stronger data protection: the model never sees the raw data, only its encrypted form, and yet it can still learn from it. Homomorphic encryption is, however, computationally intensive and can significantly slow down training. In this talk, I look at the issues and prospects arising at the intersection of federated learning and multi-key homomorphic encryption, two advanced techniques in the field of secure and collaborative machine learning.
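The core idea, that a server can aggregate client updates without ever decrypting them, can be sketched with a toy additively homomorphic scheme. The sketch below uses a textbook single-key Paillier cryptosystem with deliberately tiny, insecure parameters; the function names and the integer-scaled updates are illustrative assumptions, not code from any production library or from the talk itself.

```python
import math
import random

# Toy Paillier keypair with tiny hard-coded primes (illustration only, insecure).
def keygen(p=1009, q=1013):
    n = p * q
    lam = math.lcm(p - 1, q - 1)
    mu = pow(lam, -1, n)           # valid simplification because g = n + 1
    return (n, n + 1), (lam, mu)   # (public key, private key)

# Encrypt an integer-scaled model update m, with 0 <= m < n.
def encrypt(pub, m):
    n, g = pub
    n2 = n * n
    r = random.randrange(1, n)
    while math.gcd(r, n) != 1:
        r = random.randrange(1, n)
    return (pow(g, m, n2) * pow(r, n, n2)) % n2

def decrypt(pub, priv, c):
    n, _ = pub
    lam, mu = priv
    n2 = n * n
    l = (pow(c, lam, n2) - 1) // n  # the standard L(u) = (u - 1) / n function
    return (l * mu) % n

# One federated round: each client sends an encrypted update; the server
# multiplies ciphertexts, which adds the underlying plaintexts homomorphically,
# so the server never sees any individual update in the clear.
def secure_aggregate(pub, encrypted_updates):
    n2 = pub[0] ** 2
    agg = 1
    for c in encrypted_updates:
        agg = (agg * c) % n2
    return agg
```

A real deployment would use full-size keys and encode floating-point gradients as fixed-point integers, and the multi-key variants discussed in the talk additionally let each client hold its own key so that no single party can decrypt alone.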


4th Workshop on Cybersecurity in Healthcare 4.0
Join us at ARES 2024 in Vienna, Austria