  • Stars: 163
  • Rank: 231,141 (Top 5%)
  • Language: Python
  • License: MIT License
  • Created: about 3 years ago
  • Updated: about 1 year ago


Repository Details

VFormer

A modular PyTorch library for vision transformer models

Library Features

  • Contains implementations of prominent ViT architectures broken down into modular components like encoder, attention mechanism, and decoder
  • Makes it easy to develop custom models by composing components of different architectures
  • Contains utilities for visualizing attention maps of models using techniques such as gradient rollout
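
As an illustration of the attention-rollout idea behind these visualization utilities, the snippet below is a minimal, library-agnostic sketch of plain attention rollout in PyTorch. It assumes the per-layer attention matrices have already been collected (for example with forward hooks); the function is illustrative and is not part of VFormer's public API.

import torch

def attention_rollout(attn_maps):
    # attn_maps: list of per-layer attention tensors of shape (batch, heads, tokens, tokens)
    rollout = None
    for attn in attn_maps:
        attn = attn.mean(dim=1)                       # average over heads -> (batch, tokens, tokens)
        eye = torch.eye(attn.size(-1), device=attn.device)
        attn = 0.5 * attn + 0.5 * eye                 # account for residual connections
        attn = attn / attn.sum(dim=-1, keepdim=True)  # re-normalise each row
        rollout = attn if rollout is None else attn @ rollout  # compose layer by layer
    return rollout                                    # (batch, tokens, tokens) relevance map

Gradient rollout additionally weights each layer's attention map by the gradients of the target class score before composing; see the documentation for the utilities VFormer actually provides.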

Installation

From source (recommended)

git clone https://github.com/SforAiDl/vformer.git
cd vformer/
python setup.py install

From PyPI

pip install vformer
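
After either install, a quick sanity check is to import the package in a fresh Python session; the __version__ attribute is a common packaging convention but is assumed here, not guaranteed for every release.

import vformer
print(vformer.__version__)  # assumed attribute; importing without error is the main check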

Models supported

Example usage

To instantiate and use a Swin Transformer model -

import torch
from vformer.models.classification import SwinTransformer

image = torch.randn(1, 3, 224, 224)  # example batch: one 224x224 RGB image
model = SwinTransformer(
    img_size=224,
    patch_size=4,
    in_channels=3,
    n_classes=10,
    embed_dim=96,
    depths=[2, 2, 6, 2],
    num_heads=[3, 6, 12, 24],
    window_size=7,
    drop_rate=0.2,
)
logits = model(image)  # class logits of shape (1, 10)
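
The forward pass returns raw class logits; turning them into probabilities and a predicted label is plain PyTorch (the 10 classes simply follow the n_classes argument above):

probs = torch.softmax(logits, dim=-1)  # (1, 10) class probabilities
prediction = probs.argmax(dim=-1)      # index of the most likely class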

VFormer has a modular design and allows for easy experimentation using blocks/modules of different architectures. For example, if desired, you can use just the encoder or the windowed attention layer of the Swin Transformer model.

from vformer.attention import WindowAttention
from vformer.encoder import SwinEncoder

# Windowed multi-head self-attention block used inside Swin Transformer layers.
window_attn = WindowAttention(
    dim=128,
    window_size=7,
    num_heads=2,
    # further optional keyword arguments (e.g. dropout rates) can be passed here
)

# Encoder made of Swin Transformer blocks (depth controls how many).
swin_encoder = SwinEncoder(
    dim=128,
    input_resolution=(224, 224),
    depth=2,
    num_heads=2,
    window_size=7,
    # further optional keyword arguments can be passed here
)
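
Because these blocks are ordinary torch.nn.Module objects, they can be dropped into custom models. Below is a minimal sketch of such a composition, assuming the encoder consumes token embeddings of shape (batch, num_tokens, dim) and preserves the embedding dimension; verify these details against the documentation before relying on them.

import torch.nn as nn

class CustomClassifier(nn.Module):
    """Hypothetical classifier built around the SwinEncoder created above."""

    def __init__(self, encoder, dim=128, n_classes=10):
        super().__init__()
        self.encoder = encoder
        self.head = nn.Linear(dim, n_classes)

    def forward(self, tokens):
        # tokens: (batch, num_tokens, dim) patch/token embeddings (assumed layout)
        x = self.encoder(tokens)
        return self.head(x.mean(dim=1))  # mean-pool over tokens, then classify

custom_model = CustomClassifier(swin_encoder)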

Please refer to our documentation for more details.


More Repositories

1. KD_Lib: A Pytorch Knowledge Distillation library for benchmarking and extending works in the domains of Knowledge Distillation, Pruning, and Quantization. (Python, 593 stars)
2. Neural-Voice-Cloning-With-Few-Samples: This repository has implementation for "Neural Voice Cloning With Few Samples" (Python, 428 stars)
3. genrl: A PyTorch reinforcement learning library for generalizable and reproducible algorithm implementations with an aim to improve accessibility in RL (Python, 402 stars)
4. Deep-Learning-TIP (Jupyter Notebook, 26 stars)
5. Summer-Induction-Assignment-2021: Repository for SAiDL Summer 2021 Induction Assignment (21 stars)
6. paper-reading-group: Notes for papers presented during our paper reading sessions (20 stars)
7. SAiDL-Summer-2023-Induction-Assignment (18 stars)
8. Playground: A python library consisting of pipelines for visual analysis of different sports using Computer Vision and Deep Learning. (Python, 18 stars)
9. CountCLIP (Jupyter Notebook, 16 stars)
10. jeta: A Jax based meta learning library (Python, 16 stars)
11. decepticonlp: Python Library for Robustness Monitoring and Adversarial Debugging of NLP models (Python, 15 stars)
12. Summer-Induction-Assignment-2020: Repository for SAiDL Summer Assignment 2020 (Python, 14 stars)
13. SAiDL-Spring-2024-Induction-Assignment (13 stars)
14. SAiDL-Spring-2022-Induction-Assignment: Repository for SAiDL Spring 2022 Induction Assignment (12 stars)
15. neuroscience-ai-reading-course: Notes for the Neuroscience & AI Reading Course (SEM-I 2020-21) at BITS Pilani Goa Campus (12 stars)
16. saliency_estimation: Python library to estimate saliency (Python, 8 stars)
17. SAiDL-Season-of-Code (6 stars)
18. twitter-sanity: A python tool to recommend relevant and important tweets from your Twitter feed. (Python, 6 stars)
19. evis: A utility Python library for event-based vision (Python, 5 stars)
20. NeurIPS2020 (5 stars)
21. Winter-Assignment-2019: Winter Assignment 2019 (2 stars)
22. Bootcamp: Python + ML Bootcamp (Jupyter Notebook, 2 stars)
23. Winter-Assignment-2018 (1 star)
24. blogs (1 star)