Optimizing Federated Learning using Remote Embeddings for Graph Neural Networks

Aug 2024

Authors: Pranjal Naman and Yogesh Simmhan

Graph Neural Networks (GNNs) have advanced rapidly in recent years due to their ability to learn meaningful representations from graph-structured data. Federated Learning (FL) has emerged as a viable machine learning approach for training a shared model on decentralized data, addressing privacy concerns while leveraging parallelism. Existing methods that address the unique requirements of federated GNN training use remote embeddings to improve convergence accuracy, but their performance suffers from the large communication costs of exchanging embeddings with a shared embedding server. In this paper, we present OpES, an optimized federated GNN training framework that prunes remote neighbourhoods and overlaps the pushing of embeddings to the server with local training, reducing both network costs and training time. The modest drop in per-round accuracy caused by the pre-emptive push of embeddings is outstripped by the reduction in per-round training time for large and dense graphs like Reddit and Products: OpES converges faster than the state-of-the-art technique that uses an embedding server and achieves better accuracy than vanilla federated GNN learning.
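
The key systems idea, pushing (possibly slightly stale) boundary-node embeddings to the shared server in the background so the transfer overlaps with the local training step, can be illustrated with a minimal sketch. This is not the OpES implementation; `DummyEmbeddingServer`, `AsyncEmbeddingPusher`, and the commented-out `local_train_step` are hypothetical names used only for illustration.

```python
import queue
import threading

class DummyEmbeddingServer:
    """Stand-in for a shared embedding server (hypothetical, for illustration)."""
    def push(self, node_ids, embeddings):
        print(f"server received embeddings for {len(node_ids)} boundary nodes")

class AsyncEmbeddingPusher:
    """Pushes boundary-node embeddings on a background thread, so the network
    transfer overlaps with local training instead of blocking it."""
    def __init__(self, server):
        self.server = server
        self.queue = queue.Queue()
        self.worker = threading.Thread(target=self._drain, daemon=True)
        self.worker.start()

    def _drain(self):
        while True:
            item = self.queue.get()
            if item is None:  # shutdown sentinel
                return
            node_ids, embeddings = item
            self.server.push(node_ids, embeddings)  # network I/O off the training path

    def push_async(self, node_ids, embeddings):
        """Enqueue embeddings and return immediately; training proceeds while the
        worker thread handles the (possibly stale) transfer."""
        self.queue.put((node_ids, embeddings))

    def close(self):
        self.queue.put(None)
        self.worker.join()

if __name__ == "__main__":
    pusher = AsyncEmbeddingPusher(DummyEmbeddingServer())
    for round_num in range(3):
        # In a real client, these would be embeddings of nodes on the partition
        # boundary, computed at the start of each training round.
        boundary_ids = [1, 2, 3]
        boundary_embs = [[0.1, 0.2], [0.3, 0.4], [0.5, 0.6]]
        pusher.push_async(boundary_ids, boundary_embs)  # overlap with training
        # local_train_step(model, local_subgraph)  # local compute runs concurrently
    pusher.close()
```

The trade-off this sketch captures is the one the abstract describes: neighbours read embeddings that are one step stale, which can cost some per-round accuracy, but the client never stalls on the server round-trip, which shortens each round.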

Journal/Conference

30th International European Conference on Parallel and Distributed Computing (Euro-Par), 2024