Federated Learning of Support Vector Machines
Title: Federated Learning of Support Vector Machines
DNr: NAISS 2024/5-113
Project Type: NAISS Medium Compute
Principal Investigator: Alexander Schliep <alexander.schliep@cse.gu.se>
Affiliation: Göteborgs universitet
Duration: 2024-03-27 – 2025-04-01
Classification: 10201
Homepage: https://schlieplab.org/
Keywords:
Abstract
The goal of this project is to research federated learning for training machine learning algorithms on large-scale problems in a distributed manner. In particular, we study how data privacy can be preserved during the communication between local nodes that learning requires. For this purpose, we propose an ADMM-based SVM with differential privacy. In addition, we investigate how accuracy is affected, compared to the non-private algorithm, for both small and large actors. Communication between agents in the network is designed in a decentralized manner: no master or central agent controls the communication, and each agent communicates only with its one-hop neighbors. This design is adopted in a distributed, network-based SVM algorithm.
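To illustrate the kind of algorithm described above, the following is a minimal sketch of a decentralized consensus-ADMM linear SVM on a ring topology, where each agent shares a noise-perturbed copy of its local weights with its one-hop neighbors (a Gaussian-mechanism-style perturbation standing in for differential privacy). All data, topology, and parameter values are illustrative assumptions, not the project's actual method or settings; the local subproblem is approximated by a few subgradient steps rather than solved exactly.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: two Gaussian blobs with labels in {-1, +1}, shuffled and
# split across 4 agents (all sizes are illustrative assumptions).
n_agents, n_per, d = 4, 50, 2
half = n_agents * n_per // 2
X = np.vstack([rng.normal(+1.0, 1.0, (half, d)),
               rng.normal(-1.0, 1.0, (half, d))])
y = np.hstack([np.ones(half), -np.ones(half)])
perm = rng.permutation(len(y))
X, y = X[perm], y[perm]
parts = np.array_split(np.arange(len(y)), n_agents)

# Ring topology: each agent communicates only with its two one-hop neighbors.
neighbors = {i: [(i - 1) % n_agents, (i + 1) % n_agents]
             for i in range(n_agents)}

rho, C, sigma = 1.0, 1.0, 0.05  # ADMM penalty, SVM cost, DP noise scale (assumed)
w = np.zeros((n_agents, d))     # local primal variables
lam = np.zeros((n_agents, d))   # local dual variables

for t in range(100):
    # Each agent broadcasts a noisy copy of its weights; only these
    # perturbed messages, never the raw data, leave an agent.
    shared = w + rng.normal(0.0, sigma, w.shape)
    for i in range(n_agents):
        avg = shared[neighbors[i]].mean(axis=0)
        # Dual ascent toward consensus with the neighbors' average.
        lam[i] += rho * (w[i] - avg)
        # Approximate local subproblem: subgradient steps on
        # hinge loss + linear dual term + quadratic consensus penalty.
        Xi, yi = X[parts[i]], y[parts[i]]
        for _ in range(5):
            margins = yi * (Xi @ w[i])
            viol = margins < 1
            g_hinge = -C * (yi[viol, None] * Xi[viol]).sum(axis=0)
            grad = g_hinge + lam[i] + rho * (w[i] - avg)
            w[i] -= 0.01 / (1 + 0.1 * t) * grad

w_avg = w.mean(axis=0)
acc = np.mean(np.sign(X @ w_avg) == y)
print(f"consensus training accuracy: {acc:.2f}")
```

The noise scale `sigma` controls the privacy/accuracy trade-off that the project proposes to quantify: larger `sigma` gives stronger perturbation of the exchanged messages but slows consensus and degrades the final classifier, which is the effect to be measured against the non-private baseline.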
We will compare several federated learning methods in terms of accuracy and CPU time. We will investigate a lower bound on the number of samples that must be labeled to achieve good performance when only a few agents communicate. We will conduct experiments to evaluate the effectiveness of the developed adaptive communication strategy and of the proposed distributed multi-agent active learning on large-scale problems.