An unofficial PyTorch implementation of a federated learning algorithm, FedAvg.

Overview

Federated Averaging (FedAvg) in PyTorch

An unofficial PyTorch implementation of the Federated Averaging (FedAvg) algorithm proposed in the paper Communication-Efficient Learning of Deep Networks from Decentralized Data (implemented with Python 3.9.2).
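
At each communication round, FedAvg samples a fraction of the clients, has each sampled client train the current global model on its local data, and then replaces the global model with a sample-size-weighted average of the returned parameters. A minimal sketch of that aggregation step (standard PyTorch state_dicts only; not the repo's exact code):

```python
import copy

def federated_average(client_states, client_sizes):
    """FedAvg aggregation: weighted average of client model state_dicts.

    client_states: list of state_dicts returned by the sampled clients
    client_sizes:  number of local training examples each client holds
    """
    total = sum(client_sizes)
    avg_state = copy.deepcopy(client_states[0])
    for key in avg_state:
        avg_state[key] = sum(
            state[key] * (size / total)
            for state, size in zip(client_states, client_sizes)
        )
    return avg_state  # load into the global model via model.load_state_dict(avg_state)
```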

Implementation points

  • Implements the '2NN' and 'CNN' models from the paper with exactly the parameter counts reported there (a minimal sketch of both models follows this list).
    • 2NN: TwoNN class in models.py; 199,210 parameters
    • CNN: CNN class in models.py; 1,663,370 parameters
  • Implements the paper's non-IID data split (a sketch of the shard-based split also follows the list).
    • With MNIST, each client holds examples of at least two digit classes.
  • Runs client updates and client evaluation with multiprocessing.
  • Supports TensorBoard for tracking training logs.
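
A minimal sketch of the two architectures, reconstructed so the per-layer counts add up to the totals above (layer sizes and padding are inferred from those totals and the FedAvg paper; the actual classes live in models.py):

```python
import torch.nn as nn

class TwoNN(nn.Module):
    """MLP with two hidden layers of 200 units (784-200-200-10)."""
    def __init__(self, in_features=784, hidden=200, num_classes=10):
        super().__init__()
        self.fc1 = nn.Linear(in_features, hidden)  # 784*200 + 200 = 157,000
        self.fc2 = nn.Linear(hidden, hidden)       # 200*200 + 200 =  40,200
        self.fc3 = nn.Linear(hidden, num_classes)  # 200*10  + 10  =   2,010
        self.relu = nn.ReLU()

    def forward(self, x):
        x = x.view(x.size(0), -1)                  # flatten 28x28 images
        x = self.relu(self.fc1(x))
        x = self.relu(self.fc2(x))
        return self.fc3(x)                         # total: 199,210 parameters

class CNN(nn.Module):
    """Two 5x5 conv layers (32, 64 channels) with 2x2 max pooling, then a 512-unit FC layer."""
    def __init__(self, num_classes=10):
        super().__init__()
        self.conv1 = nn.Conv2d(1, 32, kernel_size=5, padding=2)   #       832 params
        self.conv2 = nn.Conv2d(32, 64, kernel_size=5, padding=2)  #    51,264 params
        self.pool = nn.MaxPool2d(2)
        self.fc1 = nn.Linear(64 * 7 * 7, 512)                     # 1,606,144 params
        self.fc2 = nn.Linear(512, num_classes)                     #     5,130 params
        self.relu = nn.ReLU()

    def forward(self, x):
        x = self.pool(self.relu(self.conv1(x)))    # 28x28 -> 14x14
        x = self.pool(self.relu(self.conv2(x)))    # 14x14 -> 7x7
        x = x.view(x.size(0), -1)
        x = self.relu(self.fc1(x))
        return self.fc2(x)                         # total: 1,663,370 parameters
```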

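The non-IID split from the paper can be sketched as follows: sort the training set by digit label, cut it into 200 shards of 300 examples each, and deal two shards to each of the 100 clients. This is an illustrative sketch, not the repo's exact routine; the actual code may pair shards so that every client ends up with at least two distinct digits, as stated above.

```python
import numpy as np

def non_iid_split(labels, num_clients=100, shards_per_client=2, seed=42):
    """Shard-based non-IID split in the spirit of the FedAvg paper (illustrative sketch)."""
    rng = np.random.default_rng(seed)
    num_shards = num_clients * shards_per_client
    shard_size = len(labels) // num_shards              # 60,000 / 200 = 300 for MNIST

    sorted_idx = np.argsort(labels)                     # group example indices by digit
    shards = [sorted_idx[i * shard_size:(i + 1) * shard_size] for i in range(num_shards)]

    shard_ids = rng.permutation(num_shards)             # deal shards out at random
    return [
        np.concatenate([shards[s] for s in
                        shard_ids[c * shards_per_client:(c + 1) * shards_per_client]])
        for c in range(num_clients)                     # entry k: indices owned by client k
    ]
```
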
Requirements

  • See requirements.txt

Configurations

  • See config.yaml

Run

  • python3 main.py

Results

MNIST

  • Number of clients: 100 (K = 100)
  • Fraction of sampled clients: 0.1 (C = 0.1)
  • Number of rounds: 500 (R = 500)
  • Number of local epochs: 10 (E = 10)
  • Batch size: 10 (B = 10)
  • Optimizer: torch.optim.SGD
  • Criterion: torch.nn.CrossEntropyLoss
  • Learning rate: 0.01
  • Momentum: 0.9
  • Initialization: Xavier
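
A minimal sketch of how these settings translate into PyTorch calls (standard torch APIs; whether the repository uses the normal or uniform Xavier variant is not stated, so xavier_normal_ is shown here):

```python
import torch
import torch.nn as nn

def init_weights(module):
    """Apply Xavier initialization to linear and convolutional layers."""
    if isinstance(module, (nn.Linear, nn.Conv2d)):
        nn.init.xavier_normal_(module.weight)
        if module.bias is not None:
            nn.init.zeros_(module.bias)

model = CNN()                  # or TwoNN(); see the model sketch above
model.apply(init_weights)

criterion = nn.CrossEntropyLoss()
optimizer = torch.optim.SGD(model.parameters(), lr=0.01, momentum=0.9)
```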

Table 1. Final and best accuracy

Model | Final Accuracy (IID) | Best Accuracy (IID) | Final Accuracy (non-IID) | Best Accuracy (non-IID)
2NN   | 98.38% (round 500)   | 98.45% (round 483)  | 97.50% (round 500)       | 97.65% (round 475)
CNN   | 99.31% (round 500)   | 99.34% (round 197)  | 98.73% (round 500)       | 99.28% (round 493)

Table 2. Final and lowest loss

Model | Final Loss (IID)    | Lowest Loss (IID)   | Final Loss (non-IID) | Lowest Loss (non-IID)
2NN   | 0.09296 (round 500) | 0.06956 (round 107) | 0.09075 (round 500)  | 0.08257 (round 475)
CNN   | 0.04781 (round 500) | 0.02497 (round 86)  | 0.04533 (round 500)  | 0.02413 (round 366)

Figure 1. MNIST 2NN model accuracy (IID: top / non-IID: bottom)

Figure 2. MNIST CNN model accuracy (IID: top / non-IID: bottom)

TODO

  • Run the CIFAR-10 experiment and the large-scale LSTM experiment on the Shakespeare dataset
  • Learning rate scheduling
  • More experiments with other hyperparameter settings (e.g., different combinations of B, E, K, and C)

Owner

Seok-Ju Hahn