
Attention-mechanism-implementation

PyTorch implementations of Self-Attention, Non-local, SE, SK, CBAM, and DANet.

Depending on where and how the attention weights are applied, this article divides attention mechanisms into the spatial domain, the channel domain, and the hybrid domain. For each category it introduces several representative attention models, analyzes their design and typical applications, and finally verifies experimentally the effectiveness of these attention mechanisms and the improvements they bring to CV tasks.

  1. Spatial domain attention method

1.1 Self-Attention

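Below is a minimal PyTorch sketch of a spatial self-attention block over N x C x H x W feature maps: every position attends to every other position via query/key/value 1x1 convolutions. The class name, reduction factor, and the zero-initialized residual scale `gamma` are illustrative assumptions, not necessarily this repo's exact code.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class SelfAttention2d(nn.Module):
    """Spatial self-attention over an N x C x H x W feature map."""
    def __init__(self, in_channels, reduction=8):
        super().__init__()
        self.query = nn.Conv2d(in_channels, in_channels // reduction, 1)
        self.key   = nn.Conv2d(in_channels, in_channels // reduction, 1)
        self.value = nn.Conv2d(in_channels, in_channels, 1)
        self.gamma = nn.Parameter(torch.zeros(1))  # residual scale, starts at 0

    def forward(self, x):
        n, c, h, w = x.shape
        q = self.query(x).flatten(2).transpose(1, 2)  # N x HW x C/r
        k = self.key(x).flatten(2)                    # N x C/r x HW
        attn = F.softmax(q @ k, dim=-1)               # N x HW x HW similarities
        v = self.value(x).flatten(2)                  # N x C x HW
        out = (v @ attn.transpose(1, 2)).view(n, c, h, w)
        return self.gamma * out + x                   # residual connection
```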

1.2 Non-local Attention

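The non-local block generalizes self-attention as pairwise-similarity aggregation. The sketch below uses the embedded-Gaussian form; the channel halving and the BatchNorm-terminated output projection follow the Non-local Neural Networks paper, while the exact layer names here are assumptions.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class NonLocalBlock(nn.Module):
    """Non-local block, embedded-Gaussian instantiation."""
    def __init__(self, in_channels):
        super().__init__()
        inter = in_channels // 2                      # channel halving, as in the paper
        self.theta = nn.Conv2d(in_channels, inter, 1)
        self.phi   = nn.Conv2d(in_channels, inter, 1)
        self.g     = nn.Conv2d(in_channels, inter, 1)
        self.out   = nn.Sequential(
            nn.Conv2d(inter, in_channels, 1),
            nn.BatchNorm2d(in_channels),              # can be zero-init so the block starts as identity
        )

    def forward(self, x):
        n, c, h, w = x.shape
        theta = self.theta(x).flatten(2).transpose(1, 2)  # N x HW x C/2
        phi = self.phi(x).flatten(2)                      # N x C/2 x HW
        attn = F.softmax(theta @ phi, dim=-1)             # N x HW x HW
        g = self.g(x).flatten(2).transpose(1, 2)          # N x HW x C/2
        y = (attn @ g).transpose(1, 2).reshape(n, -1, h, w)
        return self.out(y) + x                            # residual connection
```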

  2. Channel domain attention method

2.1 SENet

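An SE block squeezes each channel to a scalar by global average pooling, then excites via a small bottleneck MLP ending in a sigmoid, and rescales the channels. A minimal sketch, assuming the common reduction ratio of 16:

```python
import torch.nn as nn

class SEBlock(nn.Module):
    """Squeeze-and-Excitation channel attention."""
    def __init__(self, channels, reduction=16):
        super().__init__()
        self.fc = nn.Sequential(
            nn.Linear(channels, channels // reduction),  # squeeze to bottleneck
            nn.ReLU(inplace=True),
            nn.Linear(channels // reduction, channels),  # restore channel dim
            nn.Sigmoid(),                                # per-channel weights in (0, 1)
        )

    def forward(self, x):
        n, c, _, _ = x.shape
        s = x.mean(dim=(2, 3))            # squeeze: global average pooling -> N x C
        w = self.fc(s).view(n, c, 1, 1)   # excitation
        return x * w                      # reweight channels
```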

2.2 SKNet

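An SK unit runs branches with different receptive fields, fuses them by summation, and uses a softmax across branches (per channel) to select between them. The sketch below assumes two branches, a 3x3 convolution and a dilated 3x3 acting like a 5x5; the branch count, reduction ratio, and layer details are assumptions.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class SKConv(nn.Module):
    """Selective Kernel unit with two branches (3x3, and dilated 3x3 ~ 5x5)."""
    def __init__(self, channels, reduction=16):
        super().__init__()
        self.branch3 = nn.Sequential(
            nn.Conv2d(channels, channels, 3, padding=1),
            nn.BatchNorm2d(channels), nn.ReLU(inplace=True))
        self.branch5 = nn.Sequential(
            nn.Conv2d(channels, channels, 3, padding=2, dilation=2),
            nn.BatchNorm2d(channels), nn.ReLU(inplace=True))
        d = max(channels // reduction, 32)               # bottleneck width, floored at 32
        self.fc = nn.Linear(channels, d)
        self.fcs = nn.ModuleList(nn.Linear(d, channels) for _ in range(2))

    def forward(self, x):
        feats = torch.stack([self.branch3(x), self.branch5(x)], dim=1)  # N x 2 x C x H x W
        u = feats.sum(dim=1)                             # fuse: element-wise sum
        z = F.relu(self.fc(u.mean(dim=(2, 3))))          # global pool -> N x d
        logits = torch.stack([fc(z) for fc in self.fcs], dim=1)         # N x 2 x C
        attn = F.softmax(logits, dim=1)[..., None, None]                # softmax across branches
        return (feats * attn).sum(dim=1)                 # select: weighted branch sum
```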

  3. Hybrid domain attention method

3.1 CBAM

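CBAM applies channel attention first (a shared MLP over average- and max-pooled descriptors) and then spatial attention (a convolution over channel-wise average and max maps). A minimal sketch folding both steps into one module, with the paper's usual reduction ratio 16 and 7x7 spatial kernel as assumed defaults:

```python
import torch
import torch.nn as nn

class CBAM(nn.Module):
    """Convolutional Block Attention Module: channel then spatial attention."""
    def __init__(self, channels, reduction=16, kernel_size=7):
        super().__init__()
        self.mlp = nn.Sequential(                        # shared MLP for channel attention
            nn.Linear(channels, channels // reduction),
            nn.ReLU(inplace=True),
            nn.Linear(channels // reduction, channels),
        )
        self.spatial = nn.Conv2d(2, 1, kernel_size, padding=kernel_size // 2)

    def forward(self, x):
        n, c, _, _ = x.shape
        # Channel attention: shared MLP over avg- and max-pooled descriptors
        ca = torch.sigmoid(self.mlp(x.mean(dim=(2, 3))) + self.mlp(x.amax(dim=(2, 3))))
        x = x * ca.view(n, c, 1, 1)
        # Spatial attention: conv over channel-wise average and max maps
        sa = torch.cat([x.mean(dim=1, keepdim=True), x.amax(dim=1, keepdim=True)], dim=1)
        return x * torch.sigmoid(self.spatial(sa))
```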

3.2 DANet

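DANet runs a position attention module (essentially the spatial self-attention block of 1.1) in parallel with a channel attention module and sums their outputs. The sketch below shows only the channel attention module (CAM), which attends over channels via a Gram matrix; the class name is illustrative.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class ChannelAttentionModule(nn.Module):
    """DANet channel attention (CAM): attention over channels via a Gram matrix."""
    def __init__(self):
        super().__init__()
        self.gamma = nn.Parameter(torch.zeros(1))  # residual scale, starts at 0

    def forward(self, x):
        n, c, h, w = x.shape
        flat = x.flatten(2)                              # N x C x HW
        energy = flat @ flat.transpose(1, 2)             # N x C x C channel similarities
        # DANet subtracts energies from the per-row max before the softmax
        attn = F.softmax(energy.amax(dim=-1, keepdim=True) - energy, dim=-1)
        out = (attn @ flat).view(n, c, h, w)
        return self.gamma * out + x                      # residual connection
```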

  4. Results

For each set of experiments we use ResNet-18 as the baseline and train for 160 epochs: the initial learning rate is 0.1, reduced to 0.01 at epoch 80 and to 0.001 at epoch 160. The batch size is 128, and we use the SGD optimizer with momentum. Inputs are augmented with random cropping and random flipping. In particular, to maximize the effect of the attention blocks, every run starts with a 1-epoch warm-up, and we report the average of the best 5 epochs as the final result. A sketch of this recipe follows.
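This is a minimal sketch of the recipe above, assuming CIFAR-10-style 32x32 inputs and momentum 0.9 (neither is stated in the text); the learning-rate milestones, batch size, augmentation, and 1-epoch warm-up follow the text.

```python
import torch
import torchvision
import torchvision.transforms as T
from torch.optim.lr_scheduler import MultiStepLR

transform = T.Compose([
    T.RandomCrop(32, padding=4),      # random cropping
    T.RandomHorizontalFlip(),         # random flipping
    T.ToTensor(),
])
train_set = torchvision.datasets.CIFAR10("data", train=True, download=True, transform=transform)
loader = torch.utils.data.DataLoader(train_set, batch_size=128, shuffle=True)

model = torchvision.models.resnet18(num_classes=10)   # baseline; attention variants swap in here
optimizer = torch.optim.SGD(model.parameters(), lr=0.1, momentum=0.9)
scheduler = MultiStepLR(optimizer, milestones=[80, 160], gamma=0.1)  # 0.1 -> 0.01 -> 0.001
criterion = torch.nn.CrossEntropyLoss()

warmup_steps = len(loader)            # 1 epoch of linear warm-up
step = 0
for epoch in range(160):
    for images, labels in loader:
        if step < warmup_steps:       # ramp lr from ~0 to 0.1 over the first epoch
            for group in optimizer.param_groups:
                group["lr"] = 0.1 * (step + 1) / warmup_steps
        optimizer.zero_grad()
        loss = criterion(model(images), labels)
        loss.backward()
        optimizer.step()
        step += 1
    scheduler.step()
```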

References

Self-Attention: Vaswani et al., "Attention Is All You Need" (NeurIPS 2017)
Non-local Attention: Wang et al., "Non-local Neural Networks" (CVPR 2018)
SENet: Hu et al., "Squeeze-and-Excitation Networks" (CVPR 2018)
SKNet: Li et al., "Selective Kernel Networks" (CVPR 2019)
CBAM: Woo et al., "CBAM: Convolutional Block Attention Module" (ECCV 2018)
DANet: Fu et al., "Dual Attention Network for Scene Segmentation" (CVPR 2019)