Awesome-Attention-Mechanism-in-cv

👊 A PyTorch implementation collection of attention modules and other plug-and-play modules used in computer vision.

Table of Contents

  • Introduction
  • Attention Mechanism
  • Plug and Play Module
  • Evaluation
  • Contribute

Introduction

PyTorch implementations of the attention mechanisms used in computer vision network design, plus a collection of other plug-and-play modules. Given limited time and ability, many modules are probably still missing; for any suggestion or improvement, please open an issue or submit a PR.

Attention Mechanism

| Paper | Publish | Link | Main Idea | Blog |
| --- | --- | --- | --- | --- |
| Global Second-order Pooling Convolutional Networks | CVPR19 | GSoPNet | Combines higher-order statistics with attention in the middle of the network | |
| Neural Architecture Search for Lightweight Non-Local Networks | CVPR20 | AutoNL | NAS + lightweight non-local | |
| Squeeze and Excitation Network | CVPR18 | SENet | The classic channel attention (see the SE sketch below) | zhihu |
| Selective Kernel Network | CVPR19 | SKNet | SE + dynamic kernel selection | zhihu |
| Convolutional Block Attention Module | ECCV18 | CBAM | Spatial and channel attention in series | zhihu |
| BottleNeck Attention Module | BMVC18 | BAM | Spatial and channel attention in parallel | zhihu |
| Concurrent Spatial and Channel 'Squeeze & Excitation' in Fully Convolutional Networks | MICCAI18 | scSE | Spatial and channel attention in parallel | zhihu |
| Non-local Neural Networks | CVPR18 | Non-Local (NL) | Self-attention | zhihu |
| GCNet: Non-local Networks Meet Squeeze-Excitation Networks and Beyond | ICCVW19 | GCNet | Improves on non-local | zhihu |
| CCNet: Criss-Cross Attention for Semantic Segmentation | ICCV19 | CCNet | Improves on non-local | |
| SA-Net: Shuffle Attention for Deep Convolutional Neural Networks | ICASSP21 | SANet | SGE + channel shuffle | zhihu |
| ECA-Net: Efficient Channel Attention for Deep Convolutional Neural Networks | CVPR20 | ECANet | Improvement over SE | |
| Spatial Group-wise Enhance: Improving Semantic Feature Learning in Convolutional Networks | CoRR19 | SGENet | Group + spatial + channel | |
| FcaNet: Frequency Channel Attention Networks | CoRR20 | FcaNet | SE in the frequency domain | |
| A²-Nets: Double Attention Networks | NeurIPS18 | DANet | Applies the non-local idea to both spatial and channel dimensions | |
| Asymmetric Non-local Neural Networks for Semantic Segmentation | ICCV19 | APNB | SPP + non-local | |
| Efficient Attention: Attention with Linear Complexities | CoRR18 | EfficientAttention | Reduces the computational cost of non-local | |
| Image Restoration via Residual Non-local Attention Networks | ICLR19 | RNAN | | |
| Exploring Self-attention for Image Recognition | CVPR20 | SAN | Strong theory, simple to implement | |
| An Empirical Study of Spatial Attention Mechanisms in Deep Networks | ICCV19 | None | MSRA's empirical analysis of spatial/self-attention mechanisms | |
| Object-Contextual Representations for Semantic Segmentation | ECCV20 | OCRNet | Complex interaction mechanism that works well in practice | |
| IAUnet: Global Context-Aware Feature Learning for Person Re-Identification | TNNLS20 | IAUNet | Introduces temporal information | |
| ResNeSt: Split-Attention Networks | CoRR20 | ResNeSt | SK + ResNeXt | |
| Gather-Excite: Exploiting Feature Context in Convolutional Neural Networks | NeurIPS18 | GENet | Follow-up to SE | |
| Improving Convolutional Networks with Self-calibrated Convolutions | CVPR20 | SCNet | Self-calibrated convolution | |
| Rotate to Attend: Convolutional Triplet Attention Module | WACV21 | TripletAttention | Pairwise interaction across the C, H, and W dimensions | |
| Dual Attention Network for Scene Segmentation | CVPR19 | DANet | Self-attention | |
| Relation-Aware Global Attention for Person Re-identification | CVPR20 | RGANet | For person re-identification | |
| Attentional Feature Fusion | WACV21 | AFF | Attention-based feature fusion | |
| An Attentive Survey of Attention Models | CoRR19 | None | Covers attention in NLP, CV, recommender systems, and more | |
| Stand-Alone Self-Attention in Vision Models | NeurIPS19 | FullAttention | Replaces all convolutions with self-attention | |
| BiSeNet: Bilateral Segmentation Network for Real-time Semantic Segmentation | ECCV18 | BiSeNet | FPN-like feature fusion | zhihu |
| DCANet: Learning Connected Attentions for Convolutional Neural Networks | CoRR20 | DCANet | Enhances information flow between attention blocks | |
| Look Closer to See Better: Recurrent Attention Convolutional Neural Network for Fine-grained Image Recognition | CVPR17 Oral | RA-CNN | Fine-grained recognition | |
| Guided Attention Network for Object Detection and Counting on Drones | ACM MM20 | GANet | For object detection | |
| Attention Augmented Convolutional Networks | ICCV19 | AANet | Multi-head attention + extra feature maps | |
| Global Self-Attention Networks for Image Recognition | ICLR21 | GSA | A new global attention module | |
| Attention-Guided Hierarchical Structure Aggregation for Image Matting | CVPR20 | HAttMatting | Image matting: channel attention on high-level features guides spatial attention on low-level features | |
| Weight Excitation: Built-in Attention Mechanisms in Convolutional Neural Networks | ECCV20 | None | A weight-excitation mechanism complementary to SE | |
| Expectation-Maximization Attention Networks for Semantic Segmentation | ICCV19 Oral | EMANet | EM + attention | |
| Dense-and-Implicit Attention Network | AAAI20 | DIANet | LSTM + SE-style attention throughout the network | |
| Coordinate Attention for Efficient Mobile Network Design | CVPR21 | CoordAttention | Attention along the horizontal and vertical directions | |
| Cross-channel Communication Networks | NeurIPS19 | C3Net | GNN + SE | |
| Gated Convolutional Networks with Hybrid Connectivity for Image Classification | AAAI20 | HCGNet | Borrows some concepts from LSTMs | |
| Weighted Channel Dropout for Regularization of Deep Convolutional Neural Network | AAAI19 | None | Dropout + SE | |
| BA^2M: A Batch Aware Attention Module for Image Classification | CVPR21 | None | Attention across the batch dimension | |
| EPSANet: An Efficient Pyramid Split Attention Block on Convolutional Neural Network | CoRR21 | EPSANet | Multi-scale | |
| Stand-Alone Self-Attention in Vision Models | NeurIPS19 | SASA | Non-local variant | |
| ResT: An Efficient Transformer for Visual Recognition | CoRR21 | ResT | Self-attention variant | |
| SPANet: Spatial Pyramid Attention Network for Enhanced Image Recognition | ICME20 | SPANet | Pyramid of adaptive average pooling layers | |
| Space-time Mixing Attention for Video Transformer | CoRR21 | X-ViT (not released) | ViT + spatio-temporal attention | |
| DMSANet: Dual Multi Scale Attention Network | CoRR21 | Not released yet | Two scales + lightweight | |
| CompConv: A Compact Convolution Module for Efficient Feature Learning | CoRR21 | Not released yet | Res2Net + GhostNet | |
| VOLO: Vision Outlooker for Visual Recognition | CoRR21 | VOLO | Attention on ViT | |
| Interflow: Aggregating Multi-layer Feature Mappings with Attention Mechanism | CoRR21 | Not released yet | Attention over auxiliary heads | |
| MUSE: Parallel Multi-Scale Attention for Sequence to Sequence Learning | CoRR21 | MUSE Attention | Improves self-attention in NLP | |
| Polarized Self-Attention: Towards High-quality Pixel-wise Regression | CoRR21 | PSA | Pixel-wise regression | |
| CA-Net: Comprehensive Attention Convolutional Neural Networks for Explainable Medical Image Segmentation | TMI21 | CA-Net | Spatial attention | |
| BAM: A Lightweight and Efficient Balanced Attention Mechanism for Single Image Super Resolution | CoRR21 | BAM | Super resolution | |
| Attention as Activation | CoRR21 | ATAC | Activation + attention | |
| Region-based Non-local Operation for Video Classification | CoRR21 | RNL | Video classification | |
| MSAF: Multimodal Split Attention Fusion | CoRR21 | MSAF | Multimodal | |
| All-Attention Layer | CoRR19 | None | Transformer layer | |
| Compact Global Descriptor | CoRR20 | CGD | Attention between every pair of channels | |
| SimAM: A Simple, Parameter-Free Attention Module for Convolutional Neural Networks | ICML21 | SimAM | Brain-inspired neuron energy function | |
| Drop an Octave: Reducing Spatial Redundancy in Convolutional Neural Networks With Octave Convolution | ICCV19 | OctConv | Improvement from a frequency perspective | |
| Contextual Transformer Networks for Visual Recognition | ICCV21 | CoTNet | Billed as a Transformer improvement, but in practice very close to non-local | |
| Residual Attention: A Simple but Effective Method for Multi-Label Recognition | ICCV21 | CSRA | For multi-label image recognition | |
| Self-supervised Equivariant Attention Mechanism for Weakly Supervised Semantic Segmentation | CVPR20 | SEAM | Weakly supervised | |
| An Attention Module for Convolutional Neural Networks | ICCV21 | AW-Conv | Increases the capacity of the SE component | |
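Most of the channel-attention entries above follow the squeeze-gate-reweight pattern that SENet introduced: pool global context, compute a per-channel gate, and rescale the feature map. As a reference point, here is a minimal, unofficial PyTorch sketch of an SE block; the class name and the reduction ratio of 16 are illustrative defaults, not code taken from this repository.

```python
import torch
import torch.nn as nn

class SEBlock(nn.Module):
    """Channel attention: global average pool -> bottleneck MLP -> sigmoid gate."""
    def __init__(self, channels: int, reduction: int = 16):
        super().__init__()
        self.pool = nn.AdaptiveAvgPool2d(1)  # squeeze: (B, C, H, W) -> (B, C, 1, 1)
        self.fc = nn.Sequential(             # excitation: bottleneck MLP
            nn.Linear(channels, channels // reduction, bias=False),
            nn.ReLU(inplace=True),
            nn.Linear(channels // reduction, channels, bias=False),
            nn.Sigmoid(),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        b, c, _, _ = x.shape
        w = self.fc(self.pool(x).view(b, c)).view(b, c, 1, 1)  # per-channel gate in (0, 1)
        return x * w  # reweight channels, shape unchanged


if __name__ == "__main__":
    x = torch.randn(2, 64, 32, 32)
    print(SEBlock(64)(x).shape)  # torch.Size([2, 64, 32, 32])
```

The tensor-in, same-shape-tensor-out interface is what makes such modules "plug and play": most entries in the table can be dropped into a backbone anywhere this contract holds.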

Plug and Play Module

  • ACBlock
  • Swish and Mish activations
  • ASPP block
  • Depthwise convolution
  • Fused Conv & BN (see the folding sketch after this list)
  • Mixed depthwise convolution
  • PSP module
  • RFB module
  • SematicEmbbedBlock
  • SSH context module
  • Some other useful tools, such as concatenating and flattening feature maps
  • WeightedFeatureFusion: the feature-fusion scheme used by the FPN in EfficientDet
  • StripPooling: core code of the CVPR20 strip pooling paper
  • GhostModule: the core module of GhostNet (CVPR20)
  • SlimConv: SlimConv3x3
  • Context Gating: video classification
  • EffNetBlock: EffNet
  • BorderDet (ECCV20): border alignment module
  • DANet (CVPR19): dual attention
  • Object-Contextual Representations for semantic segmentation: OCRModule
  • FPT: includes Self Transform, Grounding Transform, and Rendering Transform
  • DOConv: depthwise over-parameterized convolution, proposed by Alibaba
  • PyConv: pyramid convolution from the Inception Institute of Artificial Intelligence (IIAI)
  • ULSAM: an ultra-lightweight subspace attention module for compact CNNs
  • DGC (ECCV20): dynamic grouped convolution for accelerating CNNs
  • DCANet (ECCV20): learning connected attentions for CNNs
  • PSConv (ECCV20): squeezing feature pyramids into compact multi-scale convolutional layers
  • Dynamic Convolution (CVPR20): dynamic filter convolution (unofficial implementation)
  • CondConv: Conditionally Parameterized Convolutions for Efficient Inference
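As an example of the inference-time items in this list, below is a hedged sketch of Conv & BN fusion: after training, a BatchNorm2d can be folded into the preceding Conv2d, since both are affine transforms, so the pair costs a single convolution at inference. The function name fuse_conv_bn is ours, not necessarily what this repository's code uses.

```python
import torch
import torch.nn as nn

@torch.no_grad()
def fuse_conv_bn(conv: nn.Conv2d, bn: nn.BatchNorm2d) -> nn.Conv2d:
    """Fold BN statistics and affine parameters into the conv's weight and bias."""
    fused = nn.Conv2d(
        conv.in_channels, conv.out_channels, conv.kernel_size,
        stride=conv.stride, padding=conv.padding, dilation=conv.dilation,
        groups=conv.groups, bias=True,
    )
    # scale = gamma / sqrt(running_var + eps), one value per output channel
    scale = bn.weight / torch.sqrt(bn.running_var + bn.eps)
    fused.weight.copy_(conv.weight * scale.reshape(-1, 1, 1, 1))
    conv_bias = conv.bias if conv.bias is not None else torch.zeros_like(bn.running_mean)
    fused.bias.copy_((conv_bias - bn.running_mean) * scale + bn.bias)
    return fused


if __name__ == "__main__":
    conv, bn = nn.Conv2d(3, 8, 3, padding=1), nn.BatchNorm2d(8)
    bn.eval()  # fusion is only valid for the fixed (running) statistics
    x = torch.randn(1, 3, 16, 16)
    print(torch.allclose(bn(conv(x)), fuse_conv_bn(conv, bn)(x), atol=1e-5))  # True
```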

Evaluation

Modules are given a preliminary evaluation on CIFAR10 with a ResNet backbone plus the module under test. The evaluation code comes from another repository: https://github.com/kuangliu/pytorch-cifar/. No pretrained weights are used; all models are trained from random initialization.

| Model | Top-1 Acc | Training Time (h:mm:ss) | Params |
| --- | --- | --- | --- |
| SENet18 | 95.28% | 1:27:50 | 11,260,354 |
| ResNet18 | 95.16% | 1:13:03 | 11,173,962 |
| ResNet50 | 95.50% | 4:24:38 | 23,520,842 |
| ShuffleNetV2 | 91.90% | 1:02:50 | 1,263,854 |
| GoogLeNet | 91.90% | 1:02:50 | 6,166,250 |
| MobileNetV2 | 92.66% | 2:04:57 | 2,296,922 |
| SA-ResNet50 | 89.83% | 2:10:07 | 23,528,758 |
| SA-ResNet18 | 95.07% | 1:39:38 | 11,171,394 |
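For context, the sketch below shows how such a benchmark can drop a module under test into a CIFAR-style ResNet basic block; the class and argument names are ours for illustration and may not match the benchmark repository's code exactly.

```python
import torch.nn as nn
import torch.nn.functional as F

class AttnBasicBlock(nn.Module):
    """ResNet BasicBlock with an optional attention module on the residual branch."""
    def __init__(self, in_planes: int, planes: int, stride: int = 1, attn: nn.Module = None):
        super().__init__()
        self.conv1 = nn.Conv2d(in_planes, planes, 3, stride, 1, bias=False)
        self.bn1 = nn.BatchNorm2d(planes)
        self.conv2 = nn.Conv2d(planes, planes, 3, 1, 1, bias=False)
        self.bn2 = nn.BatchNorm2d(planes)
        self.attn = attn if attn is not None else nn.Identity()  # module under test
        self.shortcut = nn.Sequential()
        if stride != 1 or in_planes != planes:
            self.shortcut = nn.Sequential(  # 1x1 conv to match shapes when downsampling
                nn.Conv2d(in_planes, planes, 1, stride, bias=False),
                nn.BatchNorm2d(planes),
            )

    def forward(self, x):
        out = F.relu(self.bn1(self.conv1(x)))
        out = self.bn2(self.conv2(out))
        out = self.attn(out)  # attention applied before the residual addition
        return F.relu(out + self.shortcut(x))
```

With attn=SEBlock(planes) (using the SE sketch above) this mirrors the SENet18-style rows; with the default nn.Identity() it reduces to a plain ResNet block, so the only varying factor in the comparison is the inserted module.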

Contribute

Contributions are welcome: please open an issue to suggest additional papers and their corresponding code links.
