Beijing Jiaotong University
Beijing, China
Attention
A PyTorch implementation of the Transformer model in "Attention is All You Need".
🍀 PyTorch implementations of various attention mechanisms, MLPs, re-parameterization, and convolution modules, which help in understanding the corresponding papers. ⭐⭐⭐
Practice on CIFAR-100 (ResNet, DenseNet, VGG, GoogLeNet, InceptionV3, InceptionV4, Inception-ResNetV2, Xception, ResNet in ResNet, ResNeXt, ShuffleNet, ShuffleNetV2, MobileNet, MobileNetV2, SqueezeNet…
Dual Attention Network for Scene Segmentation (CVPR2019)
Summary of related papers on visual attention. Related code based on Jittor will be released gradually.
PyTorch code for our ECCV 2018 paper "Image Super-Resolution Using Very Deep Residual Channel Attention Networks"
Visualizing RNNs using the attention mechanism
Code and trained models for "Attentional Feature Fusion"
Official PyTorch Implementation for "Rotate to Attend: Convolutional Triplet Attention Module." [WACV 2021]
The official PyTorch implementation of our ICML 2021 paper "SimAM: A Simple, Parameter-Free Attention Module for Convolutional Neural Networks".
Code for TS-CAM: Token Semantic Coupled Attention Map for Weakly Supervised Object Localization.
Official code for the ICCV 2021 paper "Residual Attention: A Simple but Effective Method for Multi-Label Recognition"
The official implementation of ELSA: Enhanced Local Self-Attention for Vision Transformer
This is an unofficial implementation of BOAT: Bilateral Local Attention Vision Transformer
PyTorch implementation of "Dynamic Convolution: Attention over Convolution Kernels" (CVPR 2020)
A method to increase the speed and lower the memory footprint of existing vision transformers.
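Most of the repositories above build on the scaled dot-product attention introduced in "Attention is All You Need" (the first entry in the list). Below is a minimal PyTorch sketch of that operation for reference; it is not taken from any of the listed repositories, and the function name, tensor shapes, and mask convention are illustrative assumptions.

```python
import math
import torch
import torch.nn.functional as F

def scaled_dot_product_attention(q, k, v, mask=None):
    """Scaled dot-product attention, following "Attention is All You Need".

    q, k, v: tensors of shape (batch, heads, seq_len, d_k).
    mask: optional boolean tensor broadcastable to (batch, heads, seq_len, seq_len);
          positions where mask is True are excluded from attention (illustrative convention).
    """
    d_k = q.size(-1)
    # Similarity scores between queries and keys, scaled by sqrt(d_k)
    # so the softmax does not saturate for large d_k.
    scores = torch.matmul(q, k.transpose(-2, -1)) / math.sqrt(d_k)
    if mask is not None:
        scores = scores.masked_fill(mask, float("-inf"))
    weights = F.softmax(scores, dim=-1)        # attention distribution over keys
    return torch.matmul(weights, v), weights   # weighted sum of values

# Usage: single-head self-attention over a batch of 2 sequences of length 5, d_k = 8.
q = torch.randn(2, 1, 5, 8)
out, attn = scaled_dot_product_attention(q, q, q)
print(out.shape, attn.shape)  # torch.Size([2, 1, 5, 8]) torch.Size([2, 1, 5, 5])
```

The channel, spatial, and convolutional attention modules in the other repositories (e.g. SimAM, triplet attention, dual attention) replace this sequence-level weighting with weights computed over feature-map dimensions, but the underlying idea of re-weighting features by learned or derived importance scores is the same.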