  • Stars: 141
  • Rank: 258,483 (Top 6%)
  • Language: Python
  • License: MIT License
  • Created almost 5 years ago
  • Updated over 1 year ago


Repository Details

[ICLR 2021] HeteroFL: Computation and Communication Efficient Federated Learning for Heterogeneous Clients

This is an implementation of HeteroFL: Computation and Communication Efficient Federated Learning for Heterogeneous Clients.

  • Global model parameters W_g are distributed to m = 6 local clients with p = 3 computation complexity levels.

Requirements

  • see requirements.txt

Instructions

  • Global hyperparameters are configured in config.yml
  • Hyperparameters can be found in process_control() in utils.py
  • fed.py contains the aggregation and separation of subnetworks
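
The control_name argument packs the run configuration into a single underscore-delimited string. A minimal sketch of how such a string might decompose, with field names inferred from the example commands in this README (the authoritative parsing is process_control() in utils.py; the meaning of the first token is not documented here, so its key below is a placeholder):

```python
def parse_control_name(control_name: str) -> dict:
    """Split an underscore-delimited control string into named fields.

    Field names are assumptions based on this README's example commands;
    'exp_id' is a placeholder for the undocumented first token.
    """
    keys = ['exp_id', 'num_users', 'active_rate', 'data_split',
            'model_split', 'model_split_mode', 'norm', 'scaler',
            'mask_cross_entropy']
    return dict(zip(keys, control_name.split('_')))

cfg = parse_control_name('1_100_0.1_iid_fix_a2-b8_bn_1_1')
```

Here cfg['num_users'] is '100' and cfg['model_split_mode'] is 'a2-b8', matching the "100 users, model split mode 'a-b (20%-80%)'" wording of the first example.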

Examples

  • Train the MNIST dataset (IID) with a CNN model, 100 users, active rate 0.1, model split 'Fix', model split mode 'a-b (20%-80%)', BatchNorm, Scaler (True), Masked CrossEntropy (True)
    python train_classifier_fed.py --data_name MNIST --model_name conv --control_name 1_100_0.1_iid_fix_a2-b8_bn_1_1
  • Train the CIFAR10 dataset (Non-IID, 2 classes) with a ResNet model, 10 users, active rate 0.1, model split 'Dynamic', model split mode 'a-b-c (uniform)', GroupNorm, Scaler (False), Masked CrossEntropy (False)
    python train_classifier_fed.py --data_name CIFAR10 --model_name resnet18 --control_name 1_10_0.1_non-iid-2_dynamic_a1-b1-c1_gn_0_0
  • Test the WikiText2 dataset with a Transformer model, 100 users, active rate 0.01, model split 'Fix', model split mode 'a-b (50%-50%)', no normalization, Scaler (True), Masked CrossEntropy (False)
    python test_transformer_fed.py --data_name WikiText2 --model_name transformer --control_name 1_100_0.01_iid_fix_a5-b5_none_1_0

Results

  • Interpolation results for the MNIST (IID) dataset between the global model complexity ((a) a, (b) b, (c) c, (d) d) and various smaller model complexities.

(figure: MNIST_interp_iid)

  • Interpolation results for the CIFAR10 (IID) dataset between the global model complexity ((a) a, (b) b, (c) c, (d) d) and various smaller model complexities.

(figure: CIFAR10_interp_iid)

  • Interpolation results for the WikiText2 (IID) dataset between the global model complexity ((a) a, (b) b, (c) c, (d) d) and various smaller model complexities.

(figure: WikiText2_interp_iid)

Acknowledgement

Enmao Diao
Jie Ding
Vahid Tarokh

More Repositories

1. FPGA-CNN: FPGA implementation of Cellular Neural Network (CNN) (Verilog, 133 stars)
2. SemiFL-Semi-Supervised-Federated-Learning-for-Unlabeled-Clients-with-Alternate-Training: [NeurIPS 2022] SemiFL: Semi-Supervised Federated Learning for Unlabeled Clients with Alternate Training (Python, 32 stars)
3. Library-Management-System: A Library Management System with PHP and MySQL (PHP, 27 stars)
4. Pruning-Deep-Neural-Networks-from-a-Sparsity-Perspective: [ICLR 2023] Pruning Deep Neural Networks from a Sparsity Perspective (Python, 20 stars)
5. RPipe: Research Pipeline (RPipe) (Python, 20 stars)
6. Speech-Emotion-Recognition-with-Dual-Sequence-LSTM-Architecture: [ICASSP 2020] Speech Emotion Recognition with Dual-Sequence LSTM Architecture (Python, 12 stars)
7. Belief-Propagation: Implementation of Generalized Belief Propagation and Convergence Rate Analysis (MATLAB, 10 stars)
8. MIREX-Audio-Melody-Extraction-Data-Analysis: Music Information Retrieval Evaluation eXchange (MIREX) Audio Melody Extraction Data Analysis, 2009-2014 (R, 8 stars)
9. Monophonic-Pitch-Tracking: Monophonic pitch tracking algorithms (HTML, 7 stars)
10. GAL-Gradient-Assisted-Learning-for-Decentralized-Multi-Organization-Collaborations: [NeurIPS 2022] GAL: Gradient Assisted Learning for Decentralized Multi-Organization Collaborations (Python, 7 stars)
11. Dimension-Reduced-Turbulent-Flow-Data-From-Deep-Vector-Quantizers: [Journal of Turbulence, DCC 2022] Dimension Reduced Turbulent Flow Data From Deep Vector Quantizers (Python, 4 stars)
12. DRASIC-Distributed-Recurrent-Autoencoder-for-Scalable-Image-Compression: [DCC 2020] DRASIC: Distributed Recurrent Autoencoder for Scalable Image Compression (Python, 4 stars)
13. Deep-Voice-Conversion (Python, 3 stars)
14. Multimodal-Controller-for-Generative-Models: [CVMI 2022] Multimodal Controller for Generative Models (Python, 3 stars)
15. ColA-Collaborative-Adaptation-with-Gradient-Learning: [arXiv] ColA: Collaborative Adaptation with Gradient Learning (Python, 3 stars)
16. Computer-Simulation: Computer Simulation projects originally from CX 4230 (Jupyter Notebook, 2 stars)
17. FLPipe: Federated Learning Pipeline (Python, 1 star)
18. Deep-Audio-Signal-Coding (Jupyter Notebook, 1 star)
19. Personalized-Federated-Recommender-Systems-with-Private-and-Partially-Federated-AutoEncoders: [Asilomar 2022] Personalized Federated Recommender Systems with Private and Partially Federated AutoEncoders (Python, 1 star)
20. LLM-for-Recommender-Systems (Python, 1 star)
21. Digital-Signal-Processing-Applications: Digital Signal Processing Applications originally from ECE 4271 (MATLAB, 1 star)
22. Earthquake-Forecasting (Python, 1 star)
23. Semi-Supervised-Federated-Learing-for-Keyword-Spotting: [ICME 2023] Semi-Supervised Federated Learning for Keyword Spotting (Python, 1 star)
24. STF-Ivy-Dream-Works-MBTI-Test: A website for testing academic abilities and psychological tendencies for STF Ivy Dream Works (PowerShell, 1 star)