Title: Deep Federated Learning using Transformers
DNr: Berzelius-2023-223
Project Type: LiU Berzelius
Principal Investigator: Sargam Gupta <sargam.gupta@umu.se>
Affiliation: Umeå universitet
Duration: 2023-09-13 – 2024-04-01
Classification: 10201
Keywords:

Abstract

Energy is a crucial resource in recent times. Building energy consumption has increased steadily in recent years and now comprises approximately 40% of total energy consumption in developed countries. Around 50% of building energy consumption in developing countries goes to heating, cooling, and domestic water heating. Because this data is very large, it can be difficult to process centrally. Moreover, collecting data from different clients may pose privacy threats. To preserve customers' privacy, federated learning (FL) can be used to build a global energy forecasting model in which customers train local models on their own data and send only the models' parameters to the server. I plan to use Temporal Fusion Transformers for the client-side models, as they are specifically designed for time series prediction. For server-side aggregation, I plan to use the well-established FedAvg algorithm. I also plan to run my experiments on public datasets such as the Energy, Electricity, and Taylor datasets, and to incorporate Differential Privacy when sharing the model parameters with the server. Using accurate energy demand prediction techniques, we can use historic energy data to predict future consumption needs, which different industries can then use to manage supply.
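To make the planned pipeline concrete, below is a minimal sketch of one federated round combining the two server/client mechanisms named above: FedAvg aggregation weighted by local dataset size, and a Gaussian-mechanism Differential Privacy step (clip-and-noise) applied to client updates before they reach the server. All names and parameters here (clip_norm, noise_std, the flat parameter vector standing in for Temporal Fusion Transformer weights) are illustrative assumptions, not the project's actual implementation.

import numpy as np

def clip_and_noise(update, clip_norm=1.0, noise_std=0.1, rng=None):
    # DP step: bound each client's influence by clipping the update's
    # L2 norm, then add Gaussian noise before it leaves the client.
    rng = rng or np.random.default_rng()
    norm = np.linalg.norm(update)
    clipped = update * min(1.0, clip_norm / (norm + 1e-12))
    return clipped + rng.normal(0.0, noise_std, size=update.shape)

def fedavg(client_updates, client_sizes):
    # FedAvg: average client updates weighted by local dataset size.
    weights = np.asarray(client_sizes, dtype=float)
    weights /= weights.sum()
    return sum(w * u for w, u in zip(weights, client_updates))

# Toy round: three clients, an 8-dimensional parameter vector standing
# in for the locally trained model's weights.
rng = np.random.default_rng(0)
global_params = np.zeros(8)
updates = [rng.normal(size=8) for _ in range(3)]   # stand-in local updates
noised = [clip_and_noise(u, rng=rng) for u in updates]
global_params += fedavg(noised, client_sizes=[120, 80, 200])
print(global_params)

In this sketch the server never sees raw client data or un-noised updates; the privacy/utility trade-off would be tuned through the assumed clip_norm and noise_std parameters.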