

TOAST: Top-Down Attention Steering

This is the official codebase of TOAST, from the following paper:

TOAST: Transfer Learning via Attention Steering
Baifeng Shi, Siyu Gai, Trevor Darrell, and Xin Wang
UC Berkeley, Microsoft Research

[figure: attention visualizations comparing transfer learning methods]

Motivation

We find that previous transfer learning methods (fine-tuning, LoRA, prompt tuning, etc.) often fail to focus on the features relevant to the downstream task (see figure above). We show that refocusing the attention on task-relevant features improves downstream performance.

What is TOAST?

TOAST is a transfer learning algorithm that adapts a large pre-trained model to a downstream task by refocusing the model's attention on task-relevant features. Specifically, TOAST freezes the pre-trained backbone and tunes a top-down attention module to refocus the attention (see figure below).

[figure: TOAST pipeline with a frozen backbone and a tuned top-down attention module]
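As a rough illustration of this setup (a minimal sketch, not the official TOAST implementation; `SteeredModel` and the tiny `steering` module here are hypothetical stand-ins for the paper's top-down attention module), freezing the backbone and training only a small steering module on top of it might look like:

```python
import torch.nn as nn


class SteeredModel(nn.Module):
    """Hypothetical sketch: frozen backbone + trainable steering module."""

    def __init__(self, backbone: nn.Module, feat_dim: int):
        super().__init__()
        self.backbone = backbone
        for p in self.backbone.parameters():
            p.requires_grad = False  # backbone stays frozen during transfer
        # Stand-in for the top-down attention module: it produces per-feature
        # weights used to reweight (refocus) the backbone's features.
        self.steering = nn.Sequential(
            nn.Linear(feat_dim, feat_dim),
            nn.Sigmoid(),
        )

    def forward(self, x):
        feats = self.backbone(x)
        return feats * self.steering(feats)  # refocus on task-relevant features


backbone = nn.Linear(8, 8)  # placeholder for a large pre-trained model
model = SteeredModel(backbone, feat_dim=8)

# Only the steering module's parameters receive gradients.
trainable = [n for n, p in model.named_parameters() if p.requires_grad]
print(trainable)
```

In this sketch, an optimizer would be constructed over `trainable` parameters only, so the pre-trained backbone is never updated.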

This repo contains:

  • visual_classification: TOAST for visual classification (including transfer learning on FGVC and VTAB)
  • language_generation: TOAST for language generation (including transfer learning on Alpaca)

This codebase is largely built upon

Citation

If you find this code helpful, please consider citing our work:

@article{shi2023toast,
  title={TOAST: Transfer Learning via Attention Steering},
  author={Shi, Baifeng and Gai, Siyu and Darrell, Trevor and Wang, Xin},
  journal={arXiv preprint arXiv:2305.15542},
  year={2023}
}